00:00:00.001 Started by upstream project "autotest-per-patch" build number 126118 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.038 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.040 The recommended git tool is: git 00:00:00.040 using credential 00000000-0000-0000-0000-000000000002 00:00:00.041 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.056 Fetching changes from the remote Git repository 00:00:00.058 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.078 Using shallow fetch with depth 1 00:00:00.078 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.078 > git --version # timeout=10 00:00:00.095 > git --version # 'git version 2.39.2' 00:00:00.095 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.115 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.115 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:03.609 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:03.621 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:03.635 Checking out Revision 308e970df89ed396a3f9dcf22fba8891259694e4 (FETCH_HEAD) 00:00:03.635 > git config core.sparsecheckout # timeout=10 00:00:03.647 > git read-tree -mu HEAD # timeout=10 00:00:03.667 > git checkout -f 308e970df89ed396a3f9dcf22fba8891259694e4 # timeout=5 00:00:03.688 Commit message: "jjb/create-perf-report: make job run concurrent" 00:00:03.688 > git rev-list --no-walk 308e970df89ed396a3f9dcf22fba8891259694e4 # timeout=10 00:00:03.803 [Pipeline] Start of Pipeline 00:00:03.817 [Pipeline] library 00:00:03.819 Loading library shm_lib@master 00:00:03.819 Library shm_lib@master is cached. Copying from home. 00:00:03.833 [Pipeline] node 00:00:03.844 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest 00:00:03.845 [Pipeline] { 00:00:03.855 [Pipeline] catchError 00:00:03.856 [Pipeline] { 00:00:03.870 [Pipeline] wrap 00:00:03.881 [Pipeline] { 00:00:03.887 [Pipeline] stage 00:00:03.889 [Pipeline] { (Prologue) 00:00:04.111 [Pipeline] sh 00:00:04.392 + logger -p user.info -t JENKINS-CI 00:00:04.409 [Pipeline] echo 00:00:04.410 Node: WFP50 00:00:04.417 [Pipeline] sh 00:00:04.710 [Pipeline] setCustomBuildProperty 00:00:04.720 [Pipeline] echo 00:00:04.721 Cleanup processes 00:00:04.726 [Pipeline] sh 00:00:05.007 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:05.007 272542 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:05.018 [Pipeline] sh 00:00:05.296 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:05.296 ++ grep -v 'sudo pgrep' 00:00:05.296 ++ awk '{print $1}' 00:00:05.296 + sudo kill -9 00:00:05.296 + true 00:00:05.311 [Pipeline] cleanWs 00:00:05.320 [WS-CLEANUP] Deleting project workspace... 00:00:05.320 [WS-CLEANUP] Deferred wipeout is used... 
00:00:05.326 [WS-CLEANUP] done 00:00:05.331 [Pipeline] setCustomBuildProperty 00:00:05.349 [Pipeline] sh 00:00:05.632 + sudo git config --global --replace-all safe.directory '*' 00:00:05.723 [Pipeline] httpRequest 00:00:05.754 [Pipeline] echo 00:00:05.755 Sorcerer 10.211.164.101 is alive 00:00:05.761 [Pipeline] httpRequest 00:00:05.765 HttpMethod: GET 00:00:05.766 URL: http://10.211.164.101/packages/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:05.767 Sending request to url: http://10.211.164.101/packages/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:05.785 Response Code: HTTP/1.1 200 OK 00:00:05.785 Success: Status code 200 is in the accepted range: 200,404 00:00:05.785 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:07.433 [Pipeline] sh 00:00:07.717 + tar --no-same-owner -xf jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz 00:00:07.734 [Pipeline] httpRequest 00:00:07.759 [Pipeline] echo 00:00:07.761 Sorcerer 10.211.164.101 is alive 00:00:07.770 [Pipeline] httpRequest 00:00:07.774 HttpMethod: GET 00:00:07.774 URL: http://10.211.164.101/packages/spdk_a49cd26ae44b3f19a6e8cd55fbeebc7693572c46.tar.gz 00:00:07.775 Sending request to url: http://10.211.164.101/packages/spdk_a49cd26ae44b3f19a6e8cd55fbeebc7693572c46.tar.gz 00:00:07.786 Response Code: HTTP/1.1 200 OK 00:00:07.787 Success: Status code 200 is in the accepted range: 200,404 00:00:07.788 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_a49cd26ae44b3f19a6e8cd55fbeebc7693572c46.tar.gz 00:00:47.239 [Pipeline] sh 00:00:47.522 + tar --no-same-owner -xf spdk_a49cd26ae44b3f19a6e8cd55fbeebc7693572c46.tar.gz 00:00:51.722 [Pipeline] sh 00:00:52.001 + git -C spdk log --oneline -n5 00:00:52.001 a49cd26ae test/accel: parametrize accel tests for DSA kernel mode 00:00:52.001 9ba518f8f test/common/autotest_common: managing idxd drivers setup 00:00:52.001 4cfe5ece8 test/setup: add configuration script for dsa devices 00:00:52.001 719d03c6a sock/uring: only register net impl if supported 00:00:52.001 e64f085ad vbdev_lvol_ut: unify usage of dummy base bdev 00:00:52.011 [Pipeline] } 00:00:52.026 [Pipeline] // stage 00:00:52.034 [Pipeline] stage 00:00:52.035 [Pipeline] { (Prepare) 00:00:52.049 [Pipeline] writeFile 00:00:52.061 [Pipeline] sh 00:00:52.339 + logger -p user.info -t JENKINS-CI 00:00:52.351 [Pipeline] sh 00:00:52.632 + logger -p user.info -t JENKINS-CI 00:00:52.646 [Pipeline] sh 00:00:52.930 + cat autorun-spdk.conf 00:00:52.930 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:52.930 SPDK_TEST_BLOCKDEV=1 00:00:52.930 SPDK_TEST_ISAL=1 00:00:52.930 SPDK_TEST_CRYPTO=1 00:00:52.930 SPDK_TEST_REDUCE=1 00:00:52.930 SPDK_TEST_VBDEV_COMPRESS=1 00:00:52.930 SPDK_RUN_UBSAN=1 00:00:52.937 RUN_NIGHTLY=0 00:00:52.943 [Pipeline] readFile 00:00:52.969 [Pipeline] withEnv 00:00:52.971 [Pipeline] { 00:00:52.984 [Pipeline] sh 00:00:53.268 + set -ex 00:00:53.268 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:00:53.268 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:53.268 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:53.268 ++ SPDK_TEST_BLOCKDEV=1 00:00:53.268 ++ SPDK_TEST_ISAL=1 00:00:53.268 ++ SPDK_TEST_CRYPTO=1 00:00:53.268 ++ SPDK_TEST_REDUCE=1 00:00:53.268 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:53.268 ++ SPDK_RUN_UBSAN=1 00:00:53.268 ++ RUN_NIGHTLY=0 00:00:53.268 + case $SPDK_TEST_NVMF_NICS in 00:00:53.268 + DRIVERS= 00:00:53.268 + [[ -n '' ]] 00:00:53.268 + exit 0 00:00:53.277 [Pipeline] } 00:00:53.298 
[Pipeline] // withEnv 00:00:53.304 [Pipeline] } 00:00:53.325 [Pipeline] // stage 00:00:53.336 [Pipeline] catchError 00:00:53.338 [Pipeline] { 00:00:53.355 [Pipeline] timeout 00:00:53.355 Timeout set to expire in 40 min 00:00:53.358 [Pipeline] { 00:00:53.373 [Pipeline] stage 00:00:53.375 [Pipeline] { (Tests) 00:00:53.390 [Pipeline] sh 00:00:53.677 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:00:53.677 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:00:53.677 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:00:53.677 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:00:53.677 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:53.677 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:00:53.677 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:00:53.677 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:53.677 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:00:53.677 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:53.677 + [[ crypto-phy-autotest == pkgdep-* ]] 00:00:53.677 + cd /var/jenkins/workspace/crypto-phy-autotest 00:00:53.677 + source /etc/os-release 00:00:53.677 ++ NAME='Fedora Linux' 00:00:53.677 ++ VERSION='38 (Cloud Edition)' 00:00:53.677 ++ ID=fedora 00:00:53.677 ++ VERSION_ID=38 00:00:53.677 ++ VERSION_CODENAME= 00:00:53.677 ++ PLATFORM_ID=platform:f38 00:00:53.677 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:00:53.677 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:53.677 ++ LOGO=fedora-logo-icon 00:00:53.677 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:00:53.677 ++ HOME_URL=https://fedoraproject.org/ 00:00:53.677 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:00:53.677 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:53.677 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:53.677 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:53.677 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:00:53.677 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:53.677 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:00:53.677 ++ SUPPORT_END=2024-05-14 00:00:53.677 ++ VARIANT='Cloud Edition' 00:00:53.677 ++ VARIANT_ID=cloud 00:00:53.677 + uname -a 00:00:53.677 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:00:53.677 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:00:56.219 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:00:56.219 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:00:56.219 Hugepages 00:00:56.219 node hugesize free / total 00:00:56.219 node0 1048576kB 0 / 0 00:00:56.219 node0 2048kB 0 / 0 00:00:56.219 node1 1048576kB 0 / 0 00:00:56.219 node1 2048kB 0 / 0 00:00:56.219 00:00:56.219 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:56.219 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:00:56.219 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:00:56.219 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:00:56.219 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:00:56.219 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:00:56.219 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:00:56.219 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:00:56.219 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:00:56.219 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:00:56.219 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:00:56.219 I/OAT 0000:80:04.1 8086 2021 1 
ioatdma - - 00:00:56.219 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:00:56.219 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:00:56.478 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:00:56.478 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:00:56.478 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:00:56.478 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:00:56.478 VMD 0000:85:05.5 8086 201d 1 - - - 00:00:56.478 VMD 0000:d7:05.5 8086 201d 1 - - - 00:00:56.478 + rm -f /tmp/spdk-ld-path 00:00:56.478 + source autorun-spdk.conf 00:00:56.478 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:56.478 ++ SPDK_TEST_BLOCKDEV=1 00:00:56.478 ++ SPDK_TEST_ISAL=1 00:00:56.478 ++ SPDK_TEST_CRYPTO=1 00:00:56.478 ++ SPDK_TEST_REDUCE=1 00:00:56.478 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:56.478 ++ SPDK_RUN_UBSAN=1 00:00:56.478 ++ RUN_NIGHTLY=0 00:00:56.478 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:56.478 + [[ -n '' ]] 00:00:56.478 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:56.478 + for M in /var/spdk/build-*-manifest.txt 00:00:56.478 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:56.478 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:56.478 + for M in /var/spdk/build-*-manifest.txt 00:00:56.478 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:56.478 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:56.478 ++ uname 00:00:56.478 + [[ Linux == \L\i\n\u\x ]] 00:00:56.478 + sudo dmesg -T 00:00:56.478 + sudo dmesg --clear 00:00:56.478 + dmesg_pid=273415 00:00:56.478 + [[ Fedora Linux == FreeBSD ]] 00:00:56.478 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:56.478 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:56.478 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:56.478 + [[ -x /usr/src/fio-static/fio ]] 00:00:56.478 + sudo dmesg -Tw 00:00:56.478 + export FIO_BIN=/usr/src/fio-static/fio 00:00:56.478 + FIO_BIN=/usr/src/fio-static/fio 00:00:56.478 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:56.478 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:00:56.478 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:56.478 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:56.478 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:56.478 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:56.478 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:56.478 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:56.478 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:56.478 Test configuration: 00:00:56.478 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:56.478 SPDK_TEST_BLOCKDEV=1 00:00:56.478 SPDK_TEST_ISAL=1 00:00:56.478 SPDK_TEST_CRYPTO=1 00:00:56.478 SPDK_TEST_REDUCE=1 00:00:56.478 SPDK_TEST_VBDEV_COMPRESS=1 00:00:56.478 SPDK_RUN_UBSAN=1 00:00:56.478 RUN_NIGHTLY=0 13:26:45 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:00:56.478 13:26:45 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:56.478 13:26:45 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:56.478 13:26:45 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:56.478 13:26:45 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:56.478 13:26:45 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:56.478 13:26:45 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:56.478 13:26:45 -- paths/export.sh@5 -- $ export PATH 00:00:56.478 13:26:45 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:56.478 13:26:45 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:00:56.478 13:26:45 -- common/autobuild_common.sh@444 -- $ date +%s 00:00:56.478 13:26:45 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720783605.XXXXXX 00:00:56.736 13:26:45 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720783605.xhUIi2 00:00:56.736 13:26:45 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:00:56.736 13:26:45 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 
00:00:56.736 13:26:45 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:00:56.736 13:26:45 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:56.736 13:26:45 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:56.736 13:26:45 -- common/autobuild_common.sh@460 -- $ get_config_params 00:00:56.736 13:26:45 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:00:56.736 13:26:45 -- common/autotest_common.sh@10 -- $ set +x 00:00:56.736 13:26:45 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:00:56.736 13:26:45 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:00:56.736 13:26:45 -- pm/common@17 -- $ local monitor 00:00:56.736 13:26:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:56.736 13:26:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:56.736 13:26:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:56.736 13:26:45 -- pm/common@21 -- $ date +%s 00:00:56.736 13:26:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:56.736 13:26:45 -- pm/common@21 -- $ date +%s 00:00:56.736 13:26:45 -- pm/common@25 -- $ sleep 1 00:00:56.736 13:26:45 -- pm/common@21 -- $ date +%s 00:00:56.736 13:26:45 -- pm/common@21 -- $ date +%s 00:00:56.736 13:26:45 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720783605 00:00:56.736 13:26:45 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720783605 00:00:56.736 13:26:45 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720783605 00:00:56.736 13:26:45 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720783605 00:00:56.736 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720783605_collect-vmstat.pm.log 00:00:56.736 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720783605_collect-cpu-load.pm.log 00:00:56.736 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720783605_collect-cpu-temp.pm.log 00:00:56.736 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720783605_collect-bmc-pm.bmc.pm.log 00:00:57.671 13:26:46 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:00:57.671 13:26:46 -- spdk/autobuild.sh@11 -- $ 
SPDK_TEST_AUTOBUILD= 00:00:57.671 13:26:46 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:57.671 13:26:46 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:57.671 13:26:46 -- spdk/autobuild.sh@16 -- $ date -u 00:00:57.671 Fri Jul 12 11:26:46 AM UTC 2024 00:00:57.671 13:26:46 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:57.671 v24.09-pre-205-ga49cd26ae 00:00:57.671 13:26:46 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:57.671 13:26:46 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:57.671 13:26:46 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:00:57.671 13:26:46 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:00:57.671 13:26:46 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:00:57.671 13:26:46 -- common/autotest_common.sh@10 -- $ set +x 00:00:57.671 ************************************ 00:00:57.671 START TEST ubsan 00:00:57.671 ************************************ 00:00:57.671 13:26:46 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:00:57.671 using ubsan 00:00:57.671 00:00:57.671 real 0m0.001s 00:00:57.671 user 0m0.000s 00:00:57.671 sys 0m0.001s 00:00:57.671 13:26:46 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:00:57.671 13:26:46 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:00:57.671 ************************************ 00:00:57.671 END TEST ubsan 00:00:57.671 ************************************ 00:00:57.671 13:26:46 -- common/autotest_common.sh@1142 -- $ return 0 00:00:57.671 13:26:46 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:57.671 13:26:46 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:57.671 13:26:46 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:57.671 13:26:46 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:00:57.671 13:26:46 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:00:57.671 13:26:46 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:00:57.671 13:26:46 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:00:57.671 13:26:46 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:00:57.671 13:26:46 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:00:57.930 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:00:57.930 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:00:58.188 Using 'verbs' RDMA provider 00:01:14.451 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:29.334 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:29.335 Creating mk/config.mk...done. 00:01:29.335 Creating mk/cc.flags.mk...done. 00:01:29.335 Type 'make' to build. 
00:01:29.335 13:27:17 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:01:29.335 13:27:17 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:29.335 13:27:17 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:29.335 13:27:17 -- common/autotest_common.sh@10 -- $ set +x 00:01:29.335 ************************************ 00:01:29.335 START TEST make 00:01:29.335 ************************************ 00:01:29.335 13:27:17 make -- common/autotest_common.sh@1123 -- $ make -j72 00:01:29.335 make[1]: Nothing to be done for 'all'. 00:02:16.061 The Meson build system 00:02:16.061 Version: 1.3.1 00:02:16.061 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:02:16.061 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:02:16.061 Build type: native build 00:02:16.061 Program cat found: YES (/usr/bin/cat) 00:02:16.061 Project name: DPDK 00:02:16.061 Project version: 24.03.0 00:02:16.061 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:02:16.061 C linker for the host machine: cc ld.bfd 2.39-16 00:02:16.061 Host machine cpu family: x86_64 00:02:16.061 Host machine cpu: x86_64 00:02:16.061 Message: ## Building in Developer Mode ## 00:02:16.061 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:16.061 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:02:16.061 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:02:16.061 Program python3 found: YES (/usr/bin/python3) 00:02:16.061 Program cat found: YES (/usr/bin/cat) 00:02:16.061 Compiler for C supports arguments -march=native: YES 00:02:16.061 Checking for size of "void *" : 8 00:02:16.061 Checking for size of "void *" : 8 (cached) 00:02:16.061 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:02:16.061 Library m found: YES 00:02:16.061 Library numa found: YES 00:02:16.061 Has header "numaif.h" : YES 00:02:16.061 Library fdt found: NO 00:02:16.061 Library execinfo found: NO 00:02:16.061 Has header "execinfo.h" : YES 00:02:16.061 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:02:16.061 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:16.061 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:16.061 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:16.061 Run-time dependency openssl found: YES 3.0.9 00:02:16.061 Run-time dependency libpcap found: YES 1.10.4 00:02:16.061 Has header "pcap.h" with dependency libpcap: YES 00:02:16.061 Compiler for C supports arguments -Wcast-qual: YES 00:02:16.061 Compiler for C supports arguments -Wdeprecated: YES 00:02:16.061 Compiler for C supports arguments -Wformat: YES 00:02:16.061 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:16.061 Compiler for C supports arguments -Wformat-security: NO 00:02:16.061 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:16.061 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:16.061 Compiler for C supports arguments -Wnested-externs: YES 00:02:16.061 Compiler for C supports arguments -Wold-style-definition: YES 00:02:16.061 Compiler for C supports arguments -Wpointer-arith: YES 00:02:16.061 Compiler for C supports arguments -Wsign-compare: YES 00:02:16.061 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:16.061 Compiler for C supports arguments -Wundef: YES 00:02:16.061 Compiler for C 
supports arguments -Wwrite-strings: YES 00:02:16.061 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:16.061 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:16.061 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:16.061 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:16.061 Program objdump found: YES (/usr/bin/objdump) 00:02:16.061 Compiler for C supports arguments -mavx512f: YES 00:02:16.061 Checking if "AVX512 checking" compiles: YES 00:02:16.061 Fetching value of define "__SSE4_2__" : 1 00:02:16.061 Fetching value of define "__AES__" : 1 00:02:16.061 Fetching value of define "__AVX__" : 1 00:02:16.061 Fetching value of define "__AVX2__" : 1 00:02:16.061 Fetching value of define "__AVX512BW__" : 1 00:02:16.061 Fetching value of define "__AVX512CD__" : 1 00:02:16.061 Fetching value of define "__AVX512DQ__" : 1 00:02:16.061 Fetching value of define "__AVX512F__" : 1 00:02:16.061 Fetching value of define "__AVX512VL__" : 1 00:02:16.061 Fetching value of define "__PCLMUL__" : 1 00:02:16.061 Fetching value of define "__RDRND__" : 1 00:02:16.061 Fetching value of define "__RDSEED__" : 1 00:02:16.061 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:16.061 Fetching value of define "__znver1__" : (undefined) 00:02:16.061 Fetching value of define "__znver2__" : (undefined) 00:02:16.061 Fetching value of define "__znver3__" : (undefined) 00:02:16.061 Fetching value of define "__znver4__" : (undefined) 00:02:16.061 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:16.061 Message: lib/log: Defining dependency "log" 00:02:16.061 Message: lib/kvargs: Defining dependency "kvargs" 00:02:16.061 Message: lib/telemetry: Defining dependency "telemetry" 00:02:16.061 Checking for function "getentropy" : NO 00:02:16.061 Message: lib/eal: Defining dependency "eal" 00:02:16.061 Message: lib/ring: Defining dependency "ring" 00:02:16.061 Message: lib/rcu: Defining dependency "rcu" 00:02:16.061 Message: lib/mempool: Defining dependency "mempool" 00:02:16.061 Message: lib/mbuf: Defining dependency "mbuf" 00:02:16.061 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:16.061 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:16.061 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:16.061 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:16.061 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:16.061 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:16.061 Compiler for C supports arguments -mpclmul: YES 00:02:16.061 Compiler for C supports arguments -maes: YES 00:02:16.061 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:16.061 Compiler for C supports arguments -mavx512bw: YES 00:02:16.061 Compiler for C supports arguments -mavx512dq: YES 00:02:16.061 Compiler for C supports arguments -mavx512vl: YES 00:02:16.061 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:16.061 Compiler for C supports arguments -mavx2: YES 00:02:16.061 Compiler for C supports arguments -mavx: YES 00:02:16.061 Message: lib/net: Defining dependency "net" 00:02:16.061 Message: lib/meter: Defining dependency "meter" 00:02:16.061 Message: lib/ethdev: Defining dependency "ethdev" 00:02:16.061 Message: lib/pci: Defining dependency "pci" 00:02:16.061 Message: lib/cmdline: Defining dependency "cmdline" 00:02:16.061 Message: lib/hash: Defining dependency "hash" 00:02:16.061 Message: lib/timer: Defining dependency "timer" 00:02:16.061 Message: 
lib/compressdev: Defining dependency "compressdev" 00:02:16.061 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:16.061 Message: lib/dmadev: Defining dependency "dmadev" 00:02:16.061 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:16.061 Message: lib/power: Defining dependency "power" 00:02:16.061 Message: lib/reorder: Defining dependency "reorder" 00:02:16.061 Message: lib/security: Defining dependency "security" 00:02:16.061 Has header "linux/userfaultfd.h" : YES 00:02:16.061 Has header "linux/vduse.h" : YES 00:02:16.061 Message: lib/vhost: Defining dependency "vhost" 00:02:16.061 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:16.061 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:02:16.061 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:16.061 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:16.061 Compiler for C supports arguments -std=c11: YES 00:02:16.061 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:02:16.061 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:02:16.061 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:02:16.061 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:02:16.061 Run-time dependency libmlx5 found: YES 1.24.44.0 00:02:16.061 Run-time dependency libibverbs found: YES 1.14.44.0 00:02:16.061 Library mtcr_ul found: NO 00:02:16.061 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:02:16.061 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:02:16.061 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:02:16.061 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:02:16.061 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:02:16.061 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:02:16.061 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:02:16.061 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:02:16.061 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies 
libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:02:16.062 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header 
"infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:16.062 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:16.062 Configuring mlx5_autoconf.h using configuration 00:02:16.062 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:16.062 Run-time dependency libcrypto found: YES 3.0.9 00:02:16.062 Library IPSec_MB found: YES 00:02:16.062 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:16.062 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:16.062 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:16.062 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:16.062 Library IPSec_MB found: YES 00:02:16.062 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:16.062 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:16.062 Compiler for C supports arguments -std=c11: YES (cached) 00:02:16.062 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:16.062 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:16.062 
Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:16.062 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:16.062 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:16.062 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:16.062 Library libisal found: NO 00:02:16.062 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:16.062 Compiler for C supports arguments -std=c11: YES (cached) 00:02:16.062 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:16.062 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:16.062 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:16.062 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:16.062 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:16.062 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:16.062 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:16.062 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:16.062 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:16.062 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:16.062 Program doxygen found: YES (/usr/bin/doxygen) 00:02:16.062 Configuring doxy-api-html.conf using configuration 00:02:16.062 Configuring doxy-api-man.conf using configuration 00:02:16.062 Program mandb found: YES (/usr/bin/mandb) 00:02:16.062 Program sphinx-build found: NO 00:02:16.062 Configuring rte_build_config.h using configuration 00:02:16.062 Message: 00:02:16.062 ================= 00:02:16.062 Applications Enabled 00:02:16.062 ================= 00:02:16.062 00:02:16.062 apps: 00:02:16.062 00:02:16.062 00:02:16.062 Message: 00:02:16.062 ================= 00:02:16.062 Libraries Enabled 00:02:16.062 ================= 00:02:16.062 00:02:16.062 libs: 00:02:16.062 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:16.062 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:16.062 cryptodev, dmadev, power, reorder, security, vhost, 00:02:16.062 00:02:16.062 Message: 00:02:16.062 =============== 00:02:16.062 Drivers Enabled 00:02:16.062 =============== 00:02:16.062 00:02:16.062 common: 00:02:16.062 mlx5, qat, 00:02:16.062 bus: 00:02:16.062 auxiliary, pci, vdev, 00:02:16.062 mempool: 00:02:16.062 ring, 00:02:16.062 dma: 00:02:16.062 00:02:16.062 net: 00:02:16.062 00:02:16.062 crypto: 00:02:16.062 ipsec_mb, mlx5, 00:02:16.062 compress: 00:02:16.062 isal, mlx5, 00:02:16.062 vdpa: 00:02:16.062 00:02:16.062 00:02:16.062 Message: 00:02:16.062 ================= 00:02:16.062 Content Skipped 00:02:16.062 ================= 00:02:16.062 00:02:16.062 apps: 00:02:16.062 dumpcap: explicitly disabled via build config 00:02:16.062 graph: explicitly disabled via build config 00:02:16.062 pdump: explicitly disabled via build config 00:02:16.063 proc-info: explicitly disabled via build config 00:02:16.063 test-acl: explicitly disabled via build config 00:02:16.063 test-bbdev: explicitly disabled via build config 00:02:16.063 test-cmdline: explicitly disabled via build config 00:02:16.063 test-compress-perf: explicitly disabled via build config 00:02:16.063 test-crypto-perf: explicitly disabled via build config 00:02:16.063 test-dma-perf: explicitly disabled via build config 00:02:16.063 test-eventdev: explicitly disabled via build config 00:02:16.063 test-fib: explicitly disabled via 
build config 00:02:16.063 test-flow-perf: explicitly disabled via build config 00:02:16.063 test-gpudev: explicitly disabled via build config 00:02:16.063 test-mldev: explicitly disabled via build config 00:02:16.063 test-pipeline: explicitly disabled via build config 00:02:16.063 test-pmd: explicitly disabled via build config 00:02:16.063 test-regex: explicitly disabled via build config 00:02:16.063 test-sad: explicitly disabled via build config 00:02:16.063 test-security-perf: explicitly disabled via build config 00:02:16.063 00:02:16.063 libs: 00:02:16.063 argparse: explicitly disabled via build config 00:02:16.063 metrics: explicitly disabled via build config 00:02:16.063 acl: explicitly disabled via build config 00:02:16.063 bbdev: explicitly disabled via build config 00:02:16.063 bitratestats: explicitly disabled via build config 00:02:16.063 bpf: explicitly disabled via build config 00:02:16.063 cfgfile: explicitly disabled via build config 00:02:16.063 distributor: explicitly disabled via build config 00:02:16.063 efd: explicitly disabled via build config 00:02:16.063 eventdev: explicitly disabled via build config 00:02:16.063 dispatcher: explicitly disabled via build config 00:02:16.063 gpudev: explicitly disabled via build config 00:02:16.063 gro: explicitly disabled via build config 00:02:16.063 gso: explicitly disabled via build config 00:02:16.063 ip_frag: explicitly disabled via build config 00:02:16.063 jobstats: explicitly disabled via build config 00:02:16.063 latencystats: explicitly disabled via build config 00:02:16.063 lpm: explicitly disabled via build config 00:02:16.063 member: explicitly disabled via build config 00:02:16.063 pcapng: explicitly disabled via build config 00:02:16.063 rawdev: explicitly disabled via build config 00:02:16.063 regexdev: explicitly disabled via build config 00:02:16.063 mldev: explicitly disabled via build config 00:02:16.063 rib: explicitly disabled via build config 00:02:16.063 sched: explicitly disabled via build config 00:02:16.063 stack: explicitly disabled via build config 00:02:16.063 ipsec: explicitly disabled via build config 00:02:16.063 pdcp: explicitly disabled via build config 00:02:16.063 fib: explicitly disabled via build config 00:02:16.063 port: explicitly disabled via build config 00:02:16.063 pdump: explicitly disabled via build config 00:02:16.063 table: explicitly disabled via build config 00:02:16.063 pipeline: explicitly disabled via build config 00:02:16.063 graph: explicitly disabled via build config 00:02:16.063 node: explicitly disabled via build config 00:02:16.063 00:02:16.063 drivers: 00:02:16.063 common/cpt: not in enabled drivers build config 00:02:16.063 common/dpaax: not in enabled drivers build config 00:02:16.063 common/iavf: not in enabled drivers build config 00:02:16.063 common/idpf: not in enabled drivers build config 00:02:16.063 common/ionic: not in enabled drivers build config 00:02:16.063 common/mvep: not in enabled drivers build config 00:02:16.063 common/octeontx: not in enabled drivers build config 00:02:16.063 bus/cdx: not in enabled drivers build config 00:02:16.063 bus/dpaa: not in enabled drivers build config 00:02:16.063 bus/fslmc: not in enabled drivers build config 00:02:16.063 bus/ifpga: not in enabled drivers build config 00:02:16.063 bus/platform: not in enabled drivers build config 00:02:16.063 bus/uacce: not in enabled drivers build config 00:02:16.063 bus/vmbus: not in enabled drivers build config 00:02:16.063 common/cnxk: not in enabled drivers build config 00:02:16.063 
common/nfp: not in enabled drivers build config 00:02:16.063 common/nitrox: not in enabled drivers build config 00:02:16.063 common/sfc_efx: not in enabled drivers build config 00:02:16.063 mempool/bucket: not in enabled drivers build config 00:02:16.063 mempool/cnxk: not in enabled drivers build config 00:02:16.063 mempool/dpaa: not in enabled drivers build config 00:02:16.063 mempool/dpaa2: not in enabled drivers build config 00:02:16.063 mempool/octeontx: not in enabled drivers build config 00:02:16.063 mempool/stack: not in enabled drivers build config 00:02:16.063 dma/cnxk: not in enabled drivers build config 00:02:16.063 dma/dpaa: not in enabled drivers build config 00:02:16.063 dma/dpaa2: not in enabled drivers build config 00:02:16.063 dma/hisilicon: not in enabled drivers build config 00:02:16.063 dma/idxd: not in enabled drivers build config 00:02:16.063 dma/ioat: not in enabled drivers build config 00:02:16.063 dma/skeleton: not in enabled drivers build config 00:02:16.063 net/af_packet: not in enabled drivers build config 00:02:16.063 net/af_xdp: not in enabled drivers build config 00:02:16.063 net/ark: not in enabled drivers build config 00:02:16.063 net/atlantic: not in enabled drivers build config 00:02:16.063 net/avp: not in enabled drivers build config 00:02:16.063 net/axgbe: not in enabled drivers build config 00:02:16.063 net/bnx2x: not in enabled drivers build config 00:02:16.063 net/bnxt: not in enabled drivers build config 00:02:16.063 net/bonding: not in enabled drivers build config 00:02:16.063 net/cnxk: not in enabled drivers build config 00:02:16.063 net/cpfl: not in enabled drivers build config 00:02:16.063 net/cxgbe: not in enabled drivers build config 00:02:16.063 net/dpaa: not in enabled drivers build config 00:02:16.063 net/dpaa2: not in enabled drivers build config 00:02:16.063 net/e1000: not in enabled drivers build config 00:02:16.063 net/ena: not in enabled drivers build config 00:02:16.063 net/enetc: not in enabled drivers build config 00:02:16.063 net/enetfec: not in enabled drivers build config 00:02:16.063 net/enic: not in enabled drivers build config 00:02:16.063 net/failsafe: not in enabled drivers build config 00:02:16.063 net/fm10k: not in enabled drivers build config 00:02:16.063 net/gve: not in enabled drivers build config 00:02:16.063 net/hinic: not in enabled drivers build config 00:02:16.063 net/hns3: not in enabled drivers build config 00:02:16.063 net/i40e: not in enabled drivers build config 00:02:16.063 net/iavf: not in enabled drivers build config 00:02:16.063 net/ice: not in enabled drivers build config 00:02:16.063 net/idpf: not in enabled drivers build config 00:02:16.063 net/igc: not in enabled drivers build config 00:02:16.063 net/ionic: not in enabled drivers build config 00:02:16.063 net/ipn3ke: not in enabled drivers build config 00:02:16.063 net/ixgbe: not in enabled drivers build config 00:02:16.063 net/mana: not in enabled drivers build config 00:02:16.063 net/memif: not in enabled drivers build config 00:02:16.063 net/mlx4: not in enabled drivers build config 00:02:16.063 net/mlx5: not in enabled drivers build config 00:02:16.063 net/mvneta: not in enabled drivers build config 00:02:16.063 net/mvpp2: not in enabled drivers build config 00:02:16.063 net/netvsc: not in enabled drivers build config 00:02:16.063 net/nfb: not in enabled drivers build config 00:02:16.063 net/nfp: not in enabled drivers build config 00:02:16.063 net/ngbe: not in enabled drivers build config 00:02:16.063 net/null: not in enabled drivers build config 
00:02:16.063 net/octeontx: not in enabled drivers build config 00:02:16.063 net/octeon_ep: not in enabled drivers build config 00:02:16.063 net/pcap: not in enabled drivers build config 00:02:16.063 net/pfe: not in enabled drivers build config 00:02:16.063 net/qede: not in enabled drivers build config 00:02:16.063 net/ring: not in enabled drivers build config 00:02:16.063 net/sfc: not in enabled drivers build config 00:02:16.063 net/softnic: not in enabled drivers build config 00:02:16.063 net/tap: not in enabled drivers build config 00:02:16.063 net/thunderx: not in enabled drivers build config 00:02:16.063 net/txgbe: not in enabled drivers build config 00:02:16.063 net/vdev_netvsc: not in enabled drivers build config 00:02:16.063 net/vhost: not in enabled drivers build config 00:02:16.063 net/virtio: not in enabled drivers build config 00:02:16.063 net/vmxnet3: not in enabled drivers build config 00:02:16.063 raw/*: missing internal dependency, "rawdev" 00:02:16.063 crypto/armv8: not in enabled drivers build config 00:02:16.063 crypto/bcmfs: not in enabled drivers build config 00:02:16.063 crypto/caam_jr: not in enabled drivers build config 00:02:16.063 crypto/ccp: not in enabled drivers build config 00:02:16.063 crypto/cnxk: not in enabled drivers build config 00:02:16.063 crypto/dpaa_sec: not in enabled drivers build config 00:02:16.063 crypto/dpaa2_sec: not in enabled drivers build config 00:02:16.063 crypto/mvsam: not in enabled drivers build config 00:02:16.063 crypto/nitrox: not in enabled drivers build config 00:02:16.063 crypto/null: not in enabled drivers build config 00:02:16.063 crypto/octeontx: not in enabled drivers build config 00:02:16.063 crypto/openssl: not in enabled drivers build config 00:02:16.063 crypto/scheduler: not in enabled drivers build config 00:02:16.063 crypto/uadk: not in enabled drivers build config 00:02:16.063 crypto/virtio: not in enabled drivers build config 00:02:16.063 compress/nitrox: not in enabled drivers build config 00:02:16.063 compress/octeontx: not in enabled drivers build config 00:02:16.063 compress/zlib: not in enabled drivers build config 00:02:16.063 regex/*: missing internal dependency, "regexdev" 00:02:16.063 ml/*: missing internal dependency, "mldev" 00:02:16.063 vdpa/ifc: not in enabled drivers build config 00:02:16.063 vdpa/mlx5: not in enabled drivers build config 00:02:16.063 vdpa/nfp: not in enabled drivers build config 00:02:16.063 vdpa/sfc: not in enabled drivers build config 00:02:16.063 event/*: missing internal dependency, "eventdev" 00:02:16.063 baseband/*: missing internal dependency, "bbdev" 00:02:16.063 gpu/*: missing internal dependency, "gpudev" 00:02:16.063 00:02:16.063 00:02:16.063 Build targets in project: 115 00:02:16.063 00:02:16.063 DPDK 24.03.0 00:02:16.063 00:02:16.063 User defined options 00:02:16.063 buildtype : debug 00:02:16.063 default_library : shared 00:02:16.063 libdir : lib 00:02:16.063 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:16.063 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:16.063 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:16.063 cpu_instruction_set: native 00:02:16.064 disable_apps : 
test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:02:16.064 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,argparse,pcapng,bbdev 00:02:16.064 enable_docs : false 00:02:16.064 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:16.064 enable_kmods : false 00:02:16.064 max_lcores : 128 00:02:16.064 tests : false 00:02:16.064 00:02:16.064 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:16.064 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:16.064 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:16.064 [2/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:16.064 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:16.064 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:16.064 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:16.064 [6/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:16.064 [7/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:16.064 [8/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:16.064 [9/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:16.064 [10/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:16.064 [11/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:16.064 [12/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:16.064 [13/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:16.064 [14/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:16.064 [15/378] Linking static target lib/librte_log.a 00:02:16.064 [16/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:16.064 [17/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:16.064 [18/378] Linking static target lib/librte_kvargs.a 00:02:16.064 [19/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:16.064 [20/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:16.064 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:16.064 [22/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:16.064 [23/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:16.064 [24/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:16.064 [25/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:16.064 [26/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:16.064 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:16.064 [28/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:16.064 [29/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:16.064 
[30/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:16.064 [31/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:16.064 [32/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:16.064 [33/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:16.064 [34/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:16.064 [35/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:16.064 [36/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:16.064 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:16.064 [38/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:16.064 [39/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:16.064 [40/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:16.064 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:16.064 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:16.064 [43/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:16.064 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:16.064 [45/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:16.064 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:16.064 [47/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:16.064 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:16.064 [49/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:16.064 [50/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:16.064 [51/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:16.064 [52/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:16.064 [53/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.064 [54/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:16.064 [55/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:16.064 [56/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:16.064 [57/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:16.064 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:16.064 [59/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:16.064 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:16.064 [61/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:16.064 [62/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:16.064 [63/378] Linking static target lib/librte_ring.a 00:02:16.064 [64/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:16.064 [65/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:16.064 [66/378] Linking static target lib/librte_telemetry.a 00:02:16.064 [67/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:16.064 [68/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:16.064 [69/378] Compiling C object 
lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:16.064 [70/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:16.064 [71/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:16.064 [72/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:16.064 [73/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:16.064 [74/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:16.064 [75/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:16.064 [76/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:16.064 [77/378] Linking static target lib/librte_pci.a 00:02:16.064 [78/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:16.064 [79/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:16.064 [80/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:16.064 [81/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:16.064 [82/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:16.064 [83/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:16.064 [84/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:16.064 [85/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:16.064 [86/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:16.064 [87/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:16.064 [88/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:16.064 [89/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:16.064 [90/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:16.064 [91/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:16.064 [92/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:16.064 [93/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:16.064 [94/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:16.064 [95/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:16.064 [96/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:16.064 [97/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:16.064 [98/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:16.064 [99/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:16.064 [100/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:16.064 [101/378] Linking static target lib/librte_mempool.a 00:02:16.064 [102/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:16.064 [103/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:16.064 [104/378] Linking static target lib/librte_rcu.a 00:02:16.064 [105/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:16.064 [106/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.064 [107/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:16.064 [108/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:16.064 [109/378] Linking target lib/librte_log.so.24.1 00:02:16.064 [110/378] Compiling C object 
drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:16.064 [111/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:16.064 [112/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:16.064 [113/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:16.064 [114/378] Linking static target lib/librte_meter.a 00:02:16.064 [115/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:16.064 [116/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.064 [117/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:16.064 [118/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:16.064 [119/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.064 [120/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:16.064 [121/378] Linking static target lib/librte_timer.a 00:02:16.064 [122/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:16.064 [123/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:16.064 [124/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:16.064 [125/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:16.064 [126/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:16.064 [127/378] Linking static target lib/librte_mbuf.a 00:02:16.064 [128/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:16.064 [129/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:16.064 [130/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:16.064 [131/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:16.064 [132/378] Linking target lib/librte_kvargs.so.24.1 00:02:16.065 [133/378] Linking static target lib/librte_cmdline.a 00:02:16.065 [134/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:16.065 [135/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:16.065 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:16.065 [137/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:16.065 [138/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:16.065 [139/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:16.065 [140/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:16.065 [141/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:16.065 [142/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:16.065 [143/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:16.065 [144/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:16.065 [145/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.065 [146/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:16.065 [147/378] Linking static target lib/librte_eal.a 00:02:16.065 [148/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:16.065 [149/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:16.065 [150/378] Compiling C object 
lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:16.065 [151/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:16.065 [152/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:16.065 [153/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.065 [154/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.065 [155/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:16.065 [156/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:16.065 [157/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:16.065 [158/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:16.065 [159/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:16.065 [160/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:16.065 [161/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:16.065 [162/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:16.065 [163/378] Linking static target lib/librte_dmadev.a 00:02:16.331 [164/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:16.331 [165/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:16.331 [166/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:16.331 [167/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:16.331 [168/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:16.331 [169/378] Linking static target lib/librte_compressdev.a 00:02:16.331 [170/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:16.331 [171/378] Linking target lib/librte_telemetry.so.24.1 00:02:16.331 [172/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:16.331 [173/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:16.331 [174/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:16.331 [175/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:16.331 [176/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:16.331 [177/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:16.331 [178/378] Linking static target lib/librte_net.a 00:02:16.331 [179/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:16.331 [180/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:16.331 [181/378] Linking static target lib/librte_reorder.a 00:02:16.331 [182/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:16.331 [183/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:16.331 [184/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:16.331 [185/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:16.331 [186/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:16.331 [187/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:16.331 [188/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:16.594 [189/378] Linking static target lib/librte_security.a 00:02:16.594 [190/378] Compiling C object 
drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:16.594 [191/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:16.594 [192/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:16.594 [193/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.594 [194/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:16.594 [195/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:16.594 [196/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:16.594 [197/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:16.594 [198/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:16.594 [199/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:16.594 [200/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:16.594 [201/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:16.594 [202/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:16.594 [203/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:16.594 [204/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:16.594 [205/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:16.594 [206/378] Linking static target lib/librte_hash.a 00:02:16.594 [207/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:16.854 [208/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:16.854 [209/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:16.854 [210/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:16.854 [211/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.854 [212/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:16.854 [213/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:16.854 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:16.854 [215/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:16.854 [216/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:16.854 [217/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:16.854 [218/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:16.854 [219/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:16.854 [220/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:16.854 [221/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.854 [222/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:16.854 [223/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:16.854 [224/378] Linking static target drivers/librte_bus_vdev.a 00:02:16.854 [225/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:16.854 [226/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:16.854 [227/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.854 [228/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:16.854 [229/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:16.854 [230/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:16.854 [231/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:16.854 [232/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:16.854 [233/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:16.854 [234/378] Linking static target lib/librte_power.a 00:02:16.854 [235/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:16.854 [236/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:16.854 [237/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:16.854 [238/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:16.854 [239/378] Linking static target drivers/librte_bus_pci.a 00:02:16.854 [240/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:16.854 [241/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:16.854 [242/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.854 [243/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:16.854 [244/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:16.854 [245/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:16.854 [246/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:16.854 [247/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:17.114 [248/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.114 [249/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.114 [250/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:17.114 [251/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:17.114 [252/378] Linking static target lib/librte_cryptodev.a 00:02:17.114 [253/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.114 [254/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:17.114 [255/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.114 [256/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:17.114 [257/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:17.114 [258/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:17.114 
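The per-driver objects in this stretch are collected into librte_* driver libraries under the meson build tree as the corresponding "Linking" targets complete. A quick way to spot-check that the QAT and mlx5 crypto/compress PMDs requested in enable_drivers were actually produced is sketched below; the build-tmp path mirrors the ninja directory reported later in this log, and the drivers/ output layout is an assumption about meson's build tree.

  # Spot-check the driver archives produced by the DPDK sub-build.
  BUILD=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
  ls -1 "$BUILD"/drivers/librte_common_qat.* \
        "$BUILD"/drivers/librte_crypto_mlx5.* \
        "$BUILD"/drivers/librte_compress_mlx5.* 2>/dev/null \
    || echo 'expected QAT/mlx5 driver libraries are missing'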
[259/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:17.114 [260/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:17.114 [261/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:17.114 [262/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:17.114 [263/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:17.114 [264/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.373 [265/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:17.373 [266/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:17.373 [267/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:17.373 [268/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:17.373 [269/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:17.373 [270/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:17.373 [271/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:17.373 [272/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:17.373 [273/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:17.373 [274/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.373 [275/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:17.373 [276/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:17.373 [277/378] Linking static target drivers/librte_mempool_ring.a 00:02:17.373 [278/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:17.373 [279/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:17.373 [280/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:17.373 [281/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:17.373 [282/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:17.373 [283/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:17.373 [284/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:17.373 [285/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:17.373 [286/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:17.631 [287/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:17.631 [288/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:17.631 [289/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:17.631 [290/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:17.631 [291/378] Linking static target drivers/librte_common_mlx5.a 00:02:17.631 [292/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:17.631 
[293/378] Linking static target lib/librte_ethdev.a 00:02:17.631 [294/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.631 [295/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:17.631 [296/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:17.631 [297/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:17.631 [298/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:17.631 [299/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:17.631 [300/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.631 [301/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:17.631 [302/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:17.631 [303/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:17.631 [304/378] Linking static target drivers/librte_compress_isal.a 00:02:17.631 [305/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:17.631 [306/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:17.631 [307/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:17.632 [308/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:17.632 [309/378] Linking static target drivers/librte_compress_mlx5.a 00:02:17.632 [310/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:18.199 [311/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.457 [312/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:18.457 [313/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:18.457 [314/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:18.457 [315/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:18.715 [316/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:18.716 [317/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:18.716 [318/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:18.716 [319/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:18.716 [320/378] Linking static target drivers/librte_common_qat.a 00:02:18.716 [321/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:18.716 [322/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:18.716 [323/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:18.974 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:18.974 [325/378] Linking static target lib/librte_vhost.a 00:02:19.233 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.766 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.671 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom 
command (wrapped by meson to capture output) 00:02:27.861 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.238 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.238 [331/378] Linking target lib/librte_eal.so.24.1 00:02:29.496 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:29.496 [333/378] Linking target lib/librte_dmadev.so.24.1 00:02:29.496 [334/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:29.496 [335/378] Linking target lib/librte_timer.so.24.1 00:02:29.496 [336/378] Linking target lib/librte_ring.so.24.1 00:02:29.496 [337/378] Linking target lib/librte_meter.so.24.1 00:02:29.496 [338/378] Linking target lib/librte_pci.so.24.1 00:02:29.496 [339/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:29.755 [340/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:29.755 [341/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:29.755 [342/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:29.755 [343/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:29.755 [344/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:29.755 [345/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:29.755 [346/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:29.755 [347/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:29.755 [348/378] Linking target lib/librte_mempool.so.24.1 00:02:29.755 [349/378] Linking target lib/librte_rcu.so.24.1 00:02:30.013 [350/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:30.014 [351/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:30.014 [352/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:30.014 [353/378] Linking target lib/librte_mbuf.so.24.1 00:02:30.014 [354/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:30.271 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:30.271 [356/378] Linking target lib/librte_net.so.24.1 00:02:30.271 [357/378] Linking target lib/librte_compressdev.so.24.1 00:02:30.271 [358/378] Linking target lib/librte_reorder.so.24.1 00:02:30.271 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:02:30.530 [360/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:30.530 [361/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:30.530 [362/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:30.530 [363/378] Linking target lib/librte_hash.so.24.1 00:02:30.530 [364/378] Linking target lib/librte_cmdline.so.24.1 00:02:30.530 [365/378] Linking target lib/librte_security.so.24.1 00:02:30.530 [366/378] Linking target lib/librte_ethdev.so.24.1 00:02:30.530 [367/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:30.530 [368/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:30.530 [369/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:30.789 [370/378] Generating symbol file 
lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:30.789 [371/378] Linking target lib/librte_power.so.24.1 00:02:30.789 [372/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:30.789 [373/378] Linking target lib/librte_vhost.so.24.1 00:02:30.789 [374/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:30.789 [375/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:31.048 [376/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:31.048 [377/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:31.048 [378/378] Linking target drivers/librte_common_qat.so.24.1 00:02:31.048 INFO: autodetecting backend as ninja 00:02:31.048 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:31.985 CC lib/log/log.o 00:02:31.985 CC lib/log/log_deprecated.o 00:02:31.985 CC lib/log/log_flags.o 00:02:32.244 CC lib/ut_mock/mock.o 00:02:32.244 CC lib/ut/ut.o 00:02:32.244 LIB libspdk_log.a 00:02:32.244 LIB libspdk_ut_mock.a 00:02:32.244 LIB libspdk_ut.a 00:02:32.244 SO libspdk_log.so.7.0 00:02:32.503 SO libspdk_ut.so.2.0 00:02:32.503 SO libspdk_ut_mock.so.6.0 00:02:32.503 SYMLINK libspdk_log.so 00:02:32.503 SYMLINK libspdk_ut.so 00:02:32.503 SYMLINK libspdk_ut_mock.so 00:02:32.762 CC lib/ioat/ioat.o 00:02:32.762 CC lib/dma/dma.o 00:02:32.762 CC lib/util/base64.o 00:02:32.762 CC lib/util/bit_array.o 00:02:32.762 CC lib/util/crc16.o 00:02:32.762 CC lib/util/cpuset.o 00:02:32.762 CC lib/util/crc32.o 00:02:32.762 CC lib/util/crc32c.o 00:02:32.762 CXX lib/trace_parser/trace.o 00:02:32.762 CC lib/util/crc32_ieee.o 00:02:32.762 CC lib/util/crc64.o 00:02:32.762 CC lib/util/dif.o 00:02:32.762 CC lib/util/fd.o 00:02:32.762 CC lib/util/file.o 00:02:32.762 CC lib/util/hexlify.o 00:02:32.762 CC lib/util/math.o 00:02:32.762 CC lib/util/iov.o 00:02:32.762 CC lib/util/pipe.o 00:02:32.762 CC lib/util/strerror_tls.o 00:02:32.762 CC lib/util/string.o 00:02:32.762 CC lib/util/uuid.o 00:02:32.762 CC lib/util/fd_group.o 00:02:32.762 CC lib/util/zipf.o 00:02:32.762 CC lib/util/xor.o 00:02:33.020 CC lib/vfio_user/host/vfio_user.o 00:02:33.020 CC lib/vfio_user/host/vfio_user_pci.o 00:02:33.020 LIB libspdk_dma.a 00:02:33.020 SO libspdk_dma.so.4.0 00:02:33.020 LIB libspdk_ioat.a 00:02:33.279 SO libspdk_ioat.so.7.0 00:02:33.279 SYMLINK libspdk_dma.so 00:02:33.279 SYMLINK libspdk_ioat.so 00:02:33.279 LIB libspdk_vfio_user.a 00:02:33.279 SO libspdk_vfio_user.so.5.0 00:02:33.538 SYMLINK libspdk_vfio_user.so 00:02:33.538 LIB libspdk_util.a 00:02:33.538 SO libspdk_util.so.9.1 00:02:33.797 SYMLINK libspdk_util.so 00:02:33.797 LIB libspdk_trace_parser.a 00:02:33.797 SO libspdk_trace_parser.so.5.0 00:02:34.055 SYMLINK libspdk_trace_parser.so 00:02:34.055 CC lib/reduce/reduce.o 00:02:34.055 CC lib/env_dpdk/env.o 00:02:34.055 CC lib/env_dpdk/memory.o 00:02:34.055 CC lib/env_dpdk/pci.o 00:02:34.055 CC lib/rdma_utils/rdma_utils.o 00:02:34.055 CC lib/json/json_parse.o 00:02:34.055 CC lib/env_dpdk/init.o 00:02:34.055 CC lib/conf/conf.o 00:02:34.055 CC lib/env_dpdk/threads.o 00:02:34.055 CC lib/json/json_util.o 00:02:34.055 CC lib/json/json_write.o 00:02:34.055 CC lib/env_dpdk/pci_ioat.o 00:02:34.055 CC lib/env_dpdk/pci_virtio.o 00:02:34.055 CC lib/env_dpdk/pci_vmd.o 00:02:34.055 CC lib/idxd/idxd.o 00:02:34.055 CC lib/idxd/idxd_user.o 00:02:34.055 CC lib/env_dpdk/pci_idxd.o 00:02:34.055 CC lib/idxd/idxd_kernel.o 00:02:34.055 CC lib/env_dpdk/pci_event.o 00:02:34.055 
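The "calculating backend command" line above is the exact ninja invocation used for the DPDK sub-build; everything from the CC lib/log/log.o line onward is SPDK's own make-driven build. Reproducing both stages by hand would look roughly like the sketch below, reusing the paths and job count from this log; the make step and its arguments are an assumption about how the surrounding autotest scripts drive the SPDK tree.

  # Re-run the DPDK sub-build with the ninja command reported above, then
  # continue SPDK's make-based build from the repository root.
  ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72
  make -C /var/jenkins/workspace/crypto-phy-autotest/spdk -j 72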
CC lib/env_dpdk/sigbus_handler.o 00:02:34.055 CC lib/vmd/vmd.o 00:02:34.055 CC lib/env_dpdk/pci_dpdk.o 00:02:34.055 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:34.055 CC lib/rdma_provider/common.o 00:02:34.055 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:34.055 CC lib/vmd/led.o 00:02:34.055 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:34.314 LIB libspdk_rdma_provider.a 00:02:34.314 LIB libspdk_conf.a 00:02:34.314 SO libspdk_rdma_provider.so.6.0 00:02:34.572 SO libspdk_conf.so.6.0 00:02:34.572 LIB libspdk_json.a 00:02:34.572 SYMLINK libspdk_rdma_provider.so 00:02:34.572 SYMLINK libspdk_conf.so 00:02:34.572 SO libspdk_json.so.6.0 00:02:34.572 SYMLINK libspdk_json.so 00:02:34.830 LIB libspdk_rdma_utils.a 00:02:34.830 SO libspdk_rdma_utils.so.1.0 00:02:34.830 LIB libspdk_reduce.a 00:02:34.830 LIB libspdk_vmd.a 00:02:34.830 SO libspdk_reduce.so.6.0 00:02:34.830 SO libspdk_vmd.so.6.0 00:02:34.830 SYMLINK libspdk_rdma_utils.so 00:02:34.830 SYMLINK libspdk_reduce.so 00:02:34.830 CC lib/jsonrpc/jsonrpc_server.o 00:02:34.830 CC lib/jsonrpc/jsonrpc_client.o 00:02:34.830 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:34.830 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:34.830 SYMLINK libspdk_vmd.so 00:02:35.089 LIB libspdk_jsonrpc.a 00:02:35.347 LIB libspdk_idxd.a 00:02:35.347 SO libspdk_jsonrpc.so.6.0 00:02:35.347 SO libspdk_idxd.so.12.0 00:02:35.347 SYMLINK libspdk_jsonrpc.so 00:02:35.347 SYMLINK libspdk_idxd.so 00:02:35.606 LIB libspdk_env_dpdk.a 00:02:35.606 SO libspdk_env_dpdk.so.14.1 00:02:35.865 CC lib/rpc/rpc.o 00:02:35.865 SYMLINK libspdk_env_dpdk.so 00:02:36.125 LIB libspdk_rpc.a 00:02:36.125 SO libspdk_rpc.so.6.0 00:02:36.125 SYMLINK libspdk_rpc.so 00:02:36.384 CC lib/trace/trace.o 00:02:36.384 CC lib/trace/trace_flags.o 00:02:36.384 CC lib/trace/trace_rpc.o 00:02:36.384 CC lib/notify/notify.o 00:02:36.384 CC lib/notify/notify_rpc.o 00:02:36.384 CC lib/keyring/keyring.o 00:02:36.384 CC lib/keyring/keyring_rpc.o 00:02:36.643 LIB libspdk_notify.a 00:02:36.643 SO libspdk_notify.so.6.0 00:02:36.901 LIB libspdk_trace.a 00:02:36.901 SYMLINK libspdk_notify.so 00:02:36.901 SO libspdk_trace.so.10.0 00:02:36.901 SYMLINK libspdk_trace.so 00:02:36.901 LIB libspdk_keyring.a 00:02:37.160 SO libspdk_keyring.so.1.0 00:02:37.160 SYMLINK libspdk_keyring.so 00:02:37.160 CC lib/thread/thread.o 00:02:37.160 CC lib/thread/iobuf.o 00:02:37.160 CC lib/sock/sock.o 00:02:37.161 CC lib/sock/sock_rpc.o 00:02:37.729 LIB libspdk_sock.a 00:02:37.729 SO libspdk_sock.so.10.0 00:02:37.729 SYMLINK libspdk_sock.so 00:02:38.296 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:38.296 CC lib/nvme/nvme_ctrlr.o 00:02:38.296 CC lib/nvme/nvme_fabric.o 00:02:38.296 CC lib/nvme/nvme_ns_cmd.o 00:02:38.296 CC lib/nvme/nvme_pcie_common.o 00:02:38.296 CC lib/nvme/nvme_ns.o 00:02:38.296 CC lib/nvme/nvme_pcie.o 00:02:38.296 CC lib/nvme/nvme.o 00:02:38.296 CC lib/nvme/nvme_qpair.o 00:02:38.296 CC lib/nvme/nvme_quirks.o 00:02:38.296 CC lib/nvme/nvme_transport.o 00:02:38.296 CC lib/nvme/nvme_discovery.o 00:02:38.296 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:38.296 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:38.296 CC lib/nvme/nvme_tcp.o 00:02:38.296 CC lib/nvme/nvme_opal.o 00:02:38.296 CC lib/nvme/nvme_io_msg.o 00:02:38.296 CC lib/nvme/nvme_poll_group.o 00:02:38.296 CC lib/nvme/nvme_zns.o 00:02:38.296 CC lib/nvme/nvme_stubs.o 00:02:38.296 CC lib/nvme/nvme_auth.o 00:02:38.296 CC lib/nvme/nvme_cuse.o 00:02:38.296 CC lib/nvme/nvme_rdma.o 00:02:38.864 LIB libspdk_thread.a 00:02:38.864 SO libspdk_thread.so.10.1 00:02:38.864 SYMLINK libspdk_thread.so 00:02:39.431 CC lib/accel/accel.o 
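Each LIB line in this part of the build drops a static libspdk_* archive under the SPDK build tree; the lib/env_dpdk and lib/nvme objects compiled here end up in libspdk_env_dpdk and libspdk_nvme. The general shape of linking an application against a few of those archives is sketched below; hello_nvme.c and the paths are placeholders, and a complete link line would additionally need the DPDK libraries and system dependencies that this sketch deliberately leaves out.

  # Shape of a link against some of the archives produced above; the -l names
  # mirror the LIB lines in this log, paths and the source file are placeholders.
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  gcc -o hello_nvme hello_nvme.c \
      -I"$SPDK_DIR/include" \
      -L"$SPDK_DIR/build/lib" \
      -lspdk_nvme -lspdk_env_dpdk -lspdk_util -lspdk_log \
      -pthread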
00:02:39.431 CC lib/blob/blobstore.o 00:02:39.431 CC lib/blob/zeroes.o 00:02:39.431 CC lib/accel/accel_rpc.o 00:02:39.431 CC lib/blob/request.o 00:02:39.431 CC lib/accel/accel_sw.o 00:02:39.431 CC lib/blob/blob_bs_dev.o 00:02:39.431 CC lib/virtio/virtio.o 00:02:39.431 CC lib/virtio/virtio_vhost_user.o 00:02:39.431 CC lib/virtio/virtio_vfio_user.o 00:02:39.431 CC lib/virtio/virtio_pci.o 00:02:39.431 CC lib/init/json_config.o 00:02:39.431 CC lib/init/subsystem.o 00:02:39.431 CC lib/init/subsystem_rpc.o 00:02:39.431 CC lib/init/rpc.o 00:02:39.690 LIB libspdk_init.a 00:02:39.690 SO libspdk_init.so.5.0 00:02:39.690 SYMLINK libspdk_init.so 00:02:39.987 LIB libspdk_nvme.a 00:02:39.987 LIB libspdk_virtio.a 00:02:39.987 SO libspdk_virtio.so.7.0 00:02:39.987 SO libspdk_nvme.so.13.1 00:02:40.246 CC lib/event/app.o 00:02:40.246 CC lib/event/reactor.o 00:02:40.246 CC lib/event/log_rpc.o 00:02:40.246 CC lib/event/app_rpc.o 00:02:40.246 CC lib/event/scheduler_static.o 00:02:40.246 SYMLINK libspdk_virtio.so 00:02:40.246 LIB libspdk_accel.a 00:02:40.246 SYMLINK libspdk_nvme.so 00:02:40.503 SO libspdk_accel.so.15.1 00:02:40.503 SYMLINK libspdk_accel.so 00:02:40.503 LIB libspdk_event.a 00:02:40.761 SO libspdk_event.so.14.0 00:02:40.761 SYMLINK libspdk_event.so 00:02:40.761 CC lib/bdev/bdev.o 00:02:40.761 CC lib/bdev/bdev_rpc.o 00:02:40.761 CC lib/bdev/part.o 00:02:40.761 CC lib/bdev/bdev_zone.o 00:02:40.761 CC lib/bdev/scsi_nvme.o 00:02:42.663 LIB libspdk_blob.a 00:02:42.922 SO libspdk_blob.so.11.0 00:02:42.922 SYMLINK libspdk_blob.so 00:02:43.180 CC lib/blobfs/blobfs.o 00:02:43.180 CC lib/blobfs/tree.o 00:02:43.439 CC lib/lvol/lvol.o 00:02:44.006 LIB libspdk_bdev.a 00:02:44.006 SO libspdk_bdev.so.15.1 00:02:44.264 LIB libspdk_blobfs.a 00:02:44.264 SO libspdk_blobfs.so.10.0 00:02:44.264 SYMLINK libspdk_bdev.so 00:02:44.264 SYMLINK libspdk_blobfs.so 00:02:44.264 LIB libspdk_lvol.a 00:02:44.526 SO libspdk_lvol.so.10.0 00:02:44.526 SYMLINK libspdk_lvol.so 00:02:44.526 CC lib/scsi/dev.o 00:02:44.526 CC lib/scsi/port.o 00:02:44.526 CC lib/scsi/lun.o 00:02:44.526 CC lib/scsi/scsi.o 00:02:44.526 CC lib/scsi/scsi_bdev.o 00:02:44.526 CC lib/scsi/scsi_pr.o 00:02:44.526 CC lib/scsi/scsi_rpc.o 00:02:44.526 CC lib/scsi/task.o 00:02:44.526 CC lib/ftl/ftl_core.o 00:02:44.526 CC lib/ftl/ftl_init.o 00:02:44.526 CC lib/ftl/ftl_debug.o 00:02:44.526 CC lib/ftl/ftl_layout.o 00:02:44.526 CC lib/nbd/nbd.o 00:02:44.526 CC lib/ftl/ftl_io.o 00:02:44.526 CC lib/nbd/nbd_rpc.o 00:02:44.526 CC lib/ftl/ftl_sb.o 00:02:44.526 CC lib/ftl/ftl_l2p.o 00:02:44.526 CC lib/ftl/ftl_nv_cache.o 00:02:44.526 CC lib/ftl/ftl_l2p_flat.o 00:02:44.526 CC lib/ftl/ftl_band.o 00:02:44.526 CC lib/ublk/ublk.o 00:02:44.526 CC lib/ftl/ftl_band_ops.o 00:02:44.526 CC lib/ublk/ublk_rpc.o 00:02:44.526 CC lib/ftl/ftl_writer.o 00:02:44.526 CC lib/nvmf/ctrlr.o 00:02:44.526 CC lib/ftl/ftl_rq.o 00:02:44.526 CC lib/nvmf/ctrlr_discovery.o 00:02:44.526 CC lib/ftl/ftl_reloc.o 00:02:44.526 CC lib/ftl/ftl_l2p_cache.o 00:02:44.526 CC lib/nvmf/ctrlr_bdev.o 00:02:44.526 CC lib/ftl/ftl_p2l.o 00:02:44.526 CC lib/nvmf/subsystem.o 00:02:44.526 CC lib/nvmf/nvmf.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:44.526 CC lib/nvmf/nvmf_rpc.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:44.526 CC lib/nvmf/tcp.o 00:02:44.526 CC lib/nvmf/transport.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:44.526 CC lib/nvmf/stubs.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:44.526 CC lib/nvmf/mdns_server.o 00:02:44.526 CC 
lib/ftl/mngt/ftl_mngt_md.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:44.526 CC lib/nvmf/rdma.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:44.526 CC lib/nvmf/auth.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:44.526 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:44.526 CC lib/ftl/utils/ftl_conf.o 00:02:44.526 CC lib/ftl/utils/ftl_md.o 00:02:44.526 CC lib/ftl/utils/ftl_mempool.o 00:02:44.526 CC lib/ftl/utils/ftl_property.o 00:02:44.526 CC lib/ftl/utils/ftl_bitmap.o 00:02:44.526 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:44.526 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:44.527 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:44.527 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:44.527 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:44.527 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:44.527 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:44.527 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:44.527 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:44.784 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:44.784 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:44.784 CC lib/ftl/base/ftl_base_dev.o 00:02:44.784 CC lib/ftl/base/ftl_base_bdev.o 00:02:44.784 CC lib/ftl/ftl_trace.o 00:02:45.352 LIB libspdk_nbd.a 00:02:45.352 SO libspdk_nbd.so.7.0 00:02:45.352 SYMLINK libspdk_nbd.so 00:02:45.611 LIB libspdk_ublk.a 00:02:45.611 SO libspdk_ublk.so.3.0 00:02:45.611 SYMLINK libspdk_ublk.so 00:02:45.611 LIB libspdk_scsi.a 00:02:45.611 SO libspdk_scsi.so.9.0 00:02:45.869 SYMLINK libspdk_scsi.so 00:02:45.869 LIB libspdk_ftl.a 00:02:46.128 SO libspdk_ftl.so.9.0 00:02:46.128 CC lib/vhost/vhost.o 00:02:46.128 CC lib/vhost/vhost_rpc.o 00:02:46.128 CC lib/vhost/vhost_blk.o 00:02:46.128 CC lib/vhost/vhost_scsi.o 00:02:46.128 CC lib/vhost/rte_vhost_user.o 00:02:46.128 CC lib/iscsi/conn.o 00:02:46.128 CC lib/iscsi/init_grp.o 00:02:46.128 CC lib/iscsi/iscsi.o 00:02:46.128 CC lib/iscsi/md5.o 00:02:46.128 CC lib/iscsi/param.o 00:02:46.128 CC lib/iscsi/portal_grp.o 00:02:46.128 CC lib/iscsi/tgt_node.o 00:02:46.128 CC lib/iscsi/iscsi_subsystem.o 00:02:46.128 CC lib/iscsi/iscsi_rpc.o 00:02:46.128 CC lib/iscsi/task.o 00:02:46.695 SYMLINK libspdk_ftl.so 00:02:47.629 LIB libspdk_nvmf.a 00:02:47.629 LIB libspdk_iscsi.a 00:02:47.629 SO libspdk_nvmf.so.18.1 00:02:47.629 SO libspdk_iscsi.so.8.0 00:02:47.629 LIB libspdk_vhost.a 00:02:47.629 SO libspdk_vhost.so.8.0 00:02:47.888 SYMLINK libspdk_nvmf.so 00:02:47.888 SYMLINK libspdk_vhost.so 00:02:48.147 SYMLINK libspdk_iscsi.so 00:02:48.714 CC module/env_dpdk/env_dpdk_rpc.o 00:02:48.714 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:48.714 CC module/sock/posix/posix.o 00:02:48.714 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:48.714 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:48.714 CC module/accel/error/accel_error.o 00:02:48.714 CC module/accel/error/accel_error_rpc.o 00:02:48.714 LIB libspdk_env_dpdk_rpc.a 00:02:48.714 CC module/blob/bdev/blob_bdev.o 00:02:48.714 CC module/keyring/file/keyring.o 00:02:48.714 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:48.714 CC module/keyring/file/keyring_rpc.o 00:02:48.714 CC module/keyring/linux/keyring.o 00:02:48.714 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:48.714 CC module/accel/ioat/accel_ioat.o 00:02:48.714 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:48.714 CC module/keyring/linux/keyring_rpc.o 00:02:48.714 CC 
module/accel/ioat/accel_ioat_rpc.o 00:02:48.714 CC module/scheduler/gscheduler/gscheduler.o 00:02:48.714 CC module/accel/iaa/accel_iaa_rpc.o 00:02:48.714 CC module/accel/iaa/accel_iaa.o 00:02:48.714 CC module/accel/dsa/accel_dsa.o 00:02:48.714 CC module/accel/dsa/accel_dsa_rpc.o 00:02:48.714 SO libspdk_env_dpdk_rpc.so.6.0 00:02:48.972 SYMLINK libspdk_env_dpdk_rpc.so 00:02:48.972 LIB libspdk_keyring_linux.a 00:02:48.972 LIB libspdk_scheduler_gscheduler.a 00:02:48.972 LIB libspdk_keyring_file.a 00:02:48.972 LIB libspdk_scheduler_dynamic.a 00:02:48.972 LIB libspdk_accel_error.a 00:02:48.972 LIB libspdk_scheduler_dpdk_governor.a 00:02:48.972 SO libspdk_keyring_linux.so.1.0 00:02:48.972 SO libspdk_scheduler_gscheduler.so.4.0 00:02:48.972 SO libspdk_keyring_file.so.1.0 00:02:48.972 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:48.972 SO libspdk_scheduler_dynamic.so.4.0 00:02:48.972 SO libspdk_accel_error.so.2.0 00:02:48.972 SYMLINK libspdk_scheduler_gscheduler.so 00:02:48.972 SYMLINK libspdk_keyring_linux.so 00:02:48.972 LIB libspdk_blob_bdev.a 00:02:48.972 LIB libspdk_accel_dsa.a 00:02:48.972 SYMLINK libspdk_scheduler_dynamic.so 00:02:48.972 LIB libspdk_accel_ioat.a 00:02:48.972 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:48.972 SYMLINK libspdk_keyring_file.so 00:02:48.972 LIB libspdk_accel_iaa.a 00:02:49.230 SYMLINK libspdk_accel_error.so 00:02:49.230 SO libspdk_blob_bdev.so.11.0 00:02:49.230 SO libspdk_accel_dsa.so.5.0 00:02:49.230 SO libspdk_accel_ioat.so.6.0 00:02:49.230 SO libspdk_accel_iaa.so.3.0 00:02:49.230 SYMLINK libspdk_blob_bdev.so 00:02:49.230 SYMLINK libspdk_accel_ioat.so 00:02:49.230 SYMLINK libspdk_accel_dsa.so 00:02:49.230 SYMLINK libspdk_accel_iaa.so 00:02:49.488 LIB libspdk_sock_posix.a 00:02:49.488 SO libspdk_sock_posix.so.6.0 00:02:49.747 CC module/bdev/malloc/bdev_malloc.o 00:02:49.747 CC module/bdev/lvol/vbdev_lvol.o 00:02:49.747 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:49.747 CC module/bdev/crypto/vbdev_crypto.o 00:02:49.747 CC module/bdev/gpt/gpt.o 00:02:49.747 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:49.747 CC module/bdev/gpt/vbdev_gpt.o 00:02:49.747 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:49.747 CC module/bdev/delay/vbdev_delay.o 00:02:49.747 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:49.747 CC module/bdev/error/vbdev_error.o 00:02:49.747 CC module/bdev/error/vbdev_error_rpc.o 00:02:49.747 CC module/bdev/split/vbdev_split.o 00:02:49.747 CC module/bdev/split/vbdev_split_rpc.o 00:02:49.747 CC module/bdev/ftl/bdev_ftl.o 00:02:49.747 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:49.747 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:49.747 CC module/bdev/passthru/vbdev_passthru.o 00:02:49.747 CC module/bdev/compress/vbdev_compress.o 00:02:49.747 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:49.747 CC module/bdev/null/bdev_null.o 00:02:49.747 CC module/bdev/null/bdev_null_rpc.o 00:02:49.747 CC module/bdev/raid/bdev_raid.o 00:02:49.747 CC module/bdev/raid/bdev_raid_rpc.o 00:02:49.747 CC module/bdev/raid/bdev_raid_sb.o 00:02:49.747 CC module/bdev/raid/raid0.o 00:02:49.747 CC module/bdev/raid/concat.o 00:02:49.747 CC module/bdev/raid/raid1.o 00:02:49.747 CC module/bdev/aio/bdev_aio.o 00:02:49.747 SYMLINK libspdk_sock_posix.so 00:02:49.747 CC module/bdev/nvme/bdev_nvme.o 00:02:49.747 CC module/bdev/aio/bdev_aio_rpc.o 00:02:49.747 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:49.747 CC module/bdev/nvme/nvme_rpc.o 00:02:49.747 CC module/bdev/nvme/bdev_mdns_client.o 00:02:49.747 CC module/bdev/nvme/vbdev_opal.o 00:02:49.747 CC 
module/bdev/virtio/bdev_virtio_scsi.o 00:02:49.747 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:49.747 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:49.747 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:49.747 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:49.747 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:49.747 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:49.747 CC module/bdev/iscsi/bdev_iscsi.o 00:02:49.747 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:49.747 CC module/blobfs/bdev/blobfs_bdev.o 00:02:49.747 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:50.005 LIB libspdk_accel_dpdk_compressdev.a 00:02:50.005 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:50.005 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:50.005 LIB libspdk_bdev_null.a 00:02:50.005 LIB libspdk_bdev_error.a 00:02:50.005 LIB libspdk_bdev_passthru.a 00:02:50.005 SO libspdk_bdev_error.so.6.0 00:02:50.005 SO libspdk_bdev_null.so.6.0 00:02:50.005 LIB libspdk_bdev_split.a 00:02:50.005 LIB libspdk_bdev_malloc.a 00:02:50.005 SO libspdk_bdev_passthru.so.6.0 00:02:50.005 LIB libspdk_bdev_aio.a 00:02:50.005 LIB libspdk_bdev_compress.a 00:02:50.005 LIB libspdk_bdev_gpt.a 00:02:50.005 SO libspdk_bdev_split.so.6.0 00:02:50.263 SO libspdk_bdev_malloc.so.6.0 00:02:50.263 SO libspdk_bdev_aio.so.6.0 00:02:50.263 SYMLINK libspdk_bdev_null.so 00:02:50.263 SYMLINK libspdk_bdev_error.so 00:02:50.263 LIB libspdk_bdev_ftl.a 00:02:50.263 LIB libspdk_bdev_crypto.a 00:02:50.263 SYMLINK libspdk_bdev_passthru.so 00:02:50.263 SO libspdk_bdev_gpt.so.6.0 00:02:50.263 SO libspdk_bdev_compress.so.6.0 00:02:50.263 LIB libspdk_accel_dpdk_cryptodev.a 00:02:50.263 SO libspdk_bdev_crypto.so.6.0 00:02:50.263 SYMLINK libspdk_bdev_split.so 00:02:50.263 SO libspdk_bdev_ftl.so.6.0 00:02:50.263 SYMLINK libspdk_bdev_aio.so 00:02:50.263 SYMLINK libspdk_bdev_malloc.so 00:02:50.263 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:50.263 SYMLINK libspdk_bdev_gpt.so 00:02:50.263 SYMLINK libspdk_bdev_compress.so 00:02:50.263 LIB libspdk_bdev_iscsi.a 00:02:50.263 SYMLINK libspdk_bdev_crypto.so 00:02:50.263 LIB libspdk_bdev_delay.a 00:02:50.263 SYMLINK libspdk_bdev_ftl.so 00:02:50.263 LIB libspdk_blobfs_bdev.a 00:02:50.263 SO libspdk_bdev_iscsi.so.6.0 00:02:50.263 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:50.263 SO libspdk_bdev_delay.so.6.0 00:02:50.263 LIB libspdk_bdev_zone_block.a 00:02:50.263 SO libspdk_blobfs_bdev.so.6.0 00:02:50.522 SYMLINK libspdk_bdev_iscsi.so 00:02:50.522 SO libspdk_bdev_zone_block.so.6.0 00:02:50.522 SYMLINK libspdk_bdev_delay.so 00:02:50.522 LIB libspdk_bdev_virtio.a 00:02:50.522 SYMLINK libspdk_blobfs_bdev.so 00:02:50.522 SO libspdk_bdev_virtio.so.6.0 00:02:50.522 LIB libspdk_bdev_lvol.a 00:02:50.522 SYMLINK libspdk_bdev_zone_block.so 00:02:50.522 LIB libspdk_bdev_raid.a 00:02:50.522 SO libspdk_bdev_lvol.so.6.0 00:02:50.522 SYMLINK libspdk_bdev_virtio.so 00:02:50.522 SO libspdk_bdev_raid.so.6.0 00:02:50.522 SYMLINK libspdk_bdev_lvol.so 00:02:50.781 SYMLINK libspdk_bdev_raid.so 00:02:52.156 LIB libspdk_bdev_nvme.a 00:02:52.156 SO libspdk_bdev_nvme.so.7.0 00:02:52.415 SYMLINK libspdk_bdev_nvme.so 00:02:52.981 CC module/event/subsystems/iobuf/iobuf.o 00:02:52.981 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:52.981 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:52.981 CC module/event/subsystems/vmd/vmd.o 00:02:52.981 CC module/event/subsystems/keyring/keyring.o 00:02:52.981 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:52.981 CC module/event/subsystems/sock/sock.o 00:02:52.981 CC 
module/event/subsystems/scheduler/scheduler.o 00:02:53.240 LIB libspdk_event_keyring.a 00:02:53.240 LIB libspdk_event_vhost_blk.a 00:02:53.240 LIB libspdk_event_sock.a 00:02:53.240 LIB libspdk_event_scheduler.a 00:02:53.240 LIB libspdk_event_iobuf.a 00:02:53.240 SO libspdk_event_keyring.so.1.0 00:02:53.240 SO libspdk_event_vhost_blk.so.3.0 00:02:53.240 SO libspdk_event_sock.so.5.0 00:02:53.240 SO libspdk_event_scheduler.so.4.0 00:02:53.240 SO libspdk_event_iobuf.so.3.0 00:02:53.240 SYMLINK libspdk_event_keyring.so 00:02:53.240 SYMLINK libspdk_event_vhost_blk.so 00:02:53.240 SYMLINK libspdk_event_scheduler.so 00:02:53.498 SYMLINK libspdk_event_sock.so 00:02:53.498 SYMLINK libspdk_event_iobuf.so 00:02:53.498 LIB libspdk_event_vmd.a 00:02:53.498 SO libspdk_event_vmd.so.6.0 00:02:53.498 SYMLINK libspdk_event_vmd.so 00:02:53.757 CC module/event/subsystems/accel/accel.o 00:02:54.015 LIB libspdk_event_accel.a 00:02:54.015 SO libspdk_event_accel.so.6.0 00:02:54.015 SYMLINK libspdk_event_accel.so 00:02:54.590 CC module/event/subsystems/bdev/bdev.o 00:02:54.590 LIB libspdk_event_bdev.a 00:02:54.590 SO libspdk_event_bdev.so.6.0 00:02:54.848 SYMLINK libspdk_event_bdev.so 00:02:55.107 CC module/event/subsystems/nbd/nbd.o 00:02:55.107 CC module/event/subsystems/scsi/scsi.o 00:02:55.107 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:55.107 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:55.107 CC module/event/subsystems/ublk/ublk.o 00:02:55.365 LIB libspdk_event_nbd.a 00:02:55.365 LIB libspdk_event_scsi.a 00:02:55.365 SO libspdk_event_nbd.so.6.0 00:02:55.365 SO libspdk_event_scsi.so.6.0 00:02:55.365 SYMLINK libspdk_event_scsi.so 00:02:55.365 SYMLINK libspdk_event_nbd.so 00:02:55.365 LIB libspdk_event_ublk.a 00:02:55.365 SO libspdk_event_ublk.so.3.0 00:02:55.623 LIB libspdk_event_nvmf.a 00:02:55.623 SYMLINK libspdk_event_ublk.so 00:02:55.623 SO libspdk_event_nvmf.so.6.0 00:02:55.623 SYMLINK libspdk_event_nvmf.so 00:02:55.881 CC module/event/subsystems/iscsi/iscsi.o 00:02:55.881 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:55.881 LIB libspdk_event_vhost_scsi.a 00:02:55.881 LIB libspdk_event_iscsi.a 00:02:55.881 SO libspdk_event_vhost_scsi.so.3.0 00:02:56.140 SO libspdk_event_iscsi.so.6.0 00:02:56.140 SYMLINK libspdk_event_vhost_scsi.so 00:02:56.140 SYMLINK libspdk_event_iscsi.so 00:02:56.399 SO libspdk.so.6.0 00:02:56.399 SYMLINK libspdk.so 00:02:56.667 CC app/spdk_lspci/spdk_lspci.o 00:02:56.667 CXX app/trace/trace.o 00:02:56.667 CC test/rpc_client/rpc_client_test.o 00:02:56.667 TEST_HEADER include/spdk/accel.h 00:02:56.667 TEST_HEADER include/spdk/assert.h 00:02:56.667 TEST_HEADER include/spdk/barrier.h 00:02:56.667 TEST_HEADER include/spdk/accel_module.h 00:02:56.667 TEST_HEADER include/spdk/base64.h 00:02:56.667 TEST_HEADER include/spdk/bdev.h 00:02:56.667 CC app/spdk_nvme_identify/identify.o 00:02:56.667 CC app/spdk_top/spdk_top.o 00:02:56.667 TEST_HEADER include/spdk/bdev_module.h 00:02:56.667 TEST_HEADER include/spdk/bdev_zone.h 00:02:56.667 CC app/spdk_nvme_discover/discovery_aer.o 00:02:56.667 TEST_HEADER include/spdk/bit_array.h 00:02:56.667 TEST_HEADER include/spdk/blob_bdev.h 00:02:56.667 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:56.667 TEST_HEADER include/spdk/bit_pool.h 00:02:56.667 TEST_HEADER include/spdk/blobfs.h 00:02:56.667 TEST_HEADER include/spdk/blob.h 00:02:56.667 TEST_HEADER include/spdk/config.h 00:02:56.667 TEST_HEADER include/spdk/conf.h 00:02:56.667 TEST_HEADER include/spdk/cpuset.h 00:02:56.667 CC app/trace_record/trace_record.o 00:02:56.667 TEST_HEADER 
include/spdk/crc32.h 00:02:56.667 TEST_HEADER include/spdk/crc16.h 00:02:56.667 TEST_HEADER include/spdk/crc64.h 00:02:56.667 TEST_HEADER include/spdk/dif.h 00:02:56.667 TEST_HEADER include/spdk/endian.h 00:02:56.667 TEST_HEADER include/spdk/dma.h 00:02:56.667 TEST_HEADER include/spdk/env.h 00:02:56.667 TEST_HEADER include/spdk/env_dpdk.h 00:02:56.667 TEST_HEADER include/spdk/event.h 00:02:56.667 CC app/spdk_nvme_perf/perf.o 00:02:56.667 TEST_HEADER include/spdk/fd_group.h 00:02:56.667 TEST_HEADER include/spdk/file.h 00:02:56.667 TEST_HEADER include/spdk/fd.h 00:02:56.667 TEST_HEADER include/spdk/gpt_spec.h 00:02:56.667 TEST_HEADER include/spdk/ftl.h 00:02:56.667 TEST_HEADER include/spdk/hexlify.h 00:02:56.667 TEST_HEADER include/spdk/histogram_data.h 00:02:56.667 TEST_HEADER include/spdk/idxd.h 00:02:56.667 TEST_HEADER include/spdk/idxd_spec.h 00:02:56.667 TEST_HEADER include/spdk/init.h 00:02:56.667 TEST_HEADER include/spdk/ioat.h 00:02:56.667 TEST_HEADER include/spdk/ioat_spec.h 00:02:56.667 TEST_HEADER include/spdk/iscsi_spec.h 00:02:56.667 TEST_HEADER include/spdk/json.h 00:02:56.667 TEST_HEADER include/spdk/jsonrpc.h 00:02:56.667 TEST_HEADER include/spdk/keyring.h 00:02:56.667 TEST_HEADER include/spdk/keyring_module.h 00:02:56.667 TEST_HEADER include/spdk/likely.h 00:02:56.667 TEST_HEADER include/spdk/log.h 00:02:56.667 TEST_HEADER include/spdk/lvol.h 00:02:56.667 TEST_HEADER include/spdk/memory.h 00:02:56.667 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:56.667 TEST_HEADER include/spdk/mmio.h 00:02:56.667 TEST_HEADER include/spdk/nbd.h 00:02:56.667 TEST_HEADER include/spdk/notify.h 00:02:56.667 TEST_HEADER include/spdk/nvme.h 00:02:56.667 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:56.667 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:56.667 TEST_HEADER include/spdk/nvme_intel.h 00:02:56.667 TEST_HEADER include/spdk/nvme_zns.h 00:02:56.667 TEST_HEADER include/spdk/nvme_spec.h 00:02:56.667 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:56.667 TEST_HEADER include/spdk/nvmf.h 00:02:56.667 TEST_HEADER include/spdk/nvmf_spec.h 00:02:56.667 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:56.667 TEST_HEADER include/spdk/opal.h 00:02:56.667 TEST_HEADER include/spdk/opal_spec.h 00:02:56.667 TEST_HEADER include/spdk/nvmf_transport.h 00:02:56.667 TEST_HEADER include/spdk/pci_ids.h 00:02:56.667 TEST_HEADER include/spdk/pipe.h 00:02:56.667 TEST_HEADER include/spdk/queue.h 00:02:56.667 TEST_HEADER include/spdk/reduce.h 00:02:56.667 CC app/iscsi_tgt/iscsi_tgt.o 00:02:56.667 TEST_HEADER include/spdk/rpc.h 00:02:56.667 TEST_HEADER include/spdk/scheduler.h 00:02:56.667 TEST_HEADER include/spdk/scsi.h 00:02:56.667 TEST_HEADER include/spdk/sock.h 00:02:56.667 TEST_HEADER include/spdk/stdinc.h 00:02:56.667 TEST_HEADER include/spdk/scsi_spec.h 00:02:56.667 TEST_HEADER include/spdk/thread.h 00:02:56.667 TEST_HEADER include/spdk/string.h 00:02:56.667 TEST_HEADER include/spdk/trace.h 00:02:56.667 TEST_HEADER include/spdk/trace_parser.h 00:02:56.667 TEST_HEADER include/spdk/tree.h 00:02:56.667 TEST_HEADER include/spdk/ublk.h 00:02:56.667 CC app/nvmf_tgt/nvmf_main.o 00:02:56.667 TEST_HEADER include/spdk/util.h 00:02:56.667 TEST_HEADER include/spdk/uuid.h 00:02:56.667 CC app/spdk_dd/spdk_dd.o 00:02:56.667 TEST_HEADER include/spdk/version.h 00:02:56.667 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:56.667 TEST_HEADER include/spdk/vhost.h 00:02:56.667 TEST_HEADER include/spdk/vmd.h 00:02:56.667 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:56.667 TEST_HEADER include/spdk/xor.h 00:02:56.667 TEST_HEADER 
include/spdk/zipf.h 00:02:56.667 CXX test/cpp_headers/accel.o 00:02:56.667 CC app/spdk_tgt/spdk_tgt.o 00:02:56.667 CXX test/cpp_headers/accel_module.o 00:02:56.667 CXX test/cpp_headers/assert.o 00:02:56.667 CXX test/cpp_headers/base64.o 00:02:56.667 CXX test/cpp_headers/bdev.o 00:02:56.667 CXX test/cpp_headers/barrier.o 00:02:56.667 CXX test/cpp_headers/bdev_module.o 00:02:56.667 CXX test/cpp_headers/bdev_zone.o 00:02:56.667 CXX test/cpp_headers/bit_array.o 00:02:56.667 CXX test/cpp_headers/blob_bdev.o 00:02:56.667 CXX test/cpp_headers/bit_pool.o 00:02:56.667 CXX test/cpp_headers/blobfs_bdev.o 00:02:56.667 CXX test/cpp_headers/blobfs.o 00:02:56.667 CXX test/cpp_headers/blob.o 00:02:56.667 CXX test/cpp_headers/conf.o 00:02:56.667 CXX test/cpp_headers/config.o 00:02:56.667 CXX test/cpp_headers/cpuset.o 00:02:56.667 CXX test/cpp_headers/crc16.o 00:02:56.667 CXX test/cpp_headers/crc32.o 00:02:56.667 CXX test/cpp_headers/crc64.o 00:02:56.667 CXX test/cpp_headers/dma.o 00:02:56.668 CXX test/cpp_headers/dif.o 00:02:56.668 CXX test/cpp_headers/endian.o 00:02:56.668 CXX test/cpp_headers/env_dpdk.o 00:02:56.668 CXX test/cpp_headers/env.o 00:02:56.668 CXX test/cpp_headers/event.o 00:02:56.668 CXX test/cpp_headers/fd_group.o 00:02:56.668 CXX test/cpp_headers/fd.o 00:02:56.668 CXX test/cpp_headers/file.o 00:02:56.668 CXX test/cpp_headers/ftl.o 00:02:56.668 CXX test/cpp_headers/hexlify.o 00:02:56.668 CXX test/cpp_headers/gpt_spec.o 00:02:56.668 CXX test/cpp_headers/histogram_data.o 00:02:56.668 CXX test/cpp_headers/idxd.o 00:02:56.668 CXX test/cpp_headers/idxd_spec.o 00:02:56.668 CXX test/cpp_headers/ioat.o 00:02:56.668 CXX test/cpp_headers/ioat_spec.o 00:02:56.668 CXX test/cpp_headers/json.o 00:02:56.668 CXX test/cpp_headers/init.o 00:02:56.668 CXX test/cpp_headers/jsonrpc.o 00:02:56.668 CXX test/cpp_headers/keyring.o 00:02:56.668 CXX test/cpp_headers/iscsi_spec.o 00:02:56.934 LINK spdk_lspci 00:02:56.934 CC test/env/pci/pci_ut.o 00:02:56.934 CC examples/ioat/verify/verify.o 00:02:56.934 CC test/env/vtophys/vtophys.o 00:02:56.934 CC examples/ioat/perf/perf.o 00:02:56.934 CC examples/util/zipf/zipf.o 00:02:56.934 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:56.934 CC test/app/stub/stub.o 00:02:56.934 CC test/app/jsoncat/jsoncat.o 00:02:56.934 CC test/thread/poller_perf/poller_perf.o 00:02:56.934 CXX test/cpp_headers/keyring_module.o 00:02:56.934 CC test/app/histogram_perf/histogram_perf.o 00:02:56.934 CC app/fio/bdev/fio_plugin.o 00:02:56.934 CC test/env/memory/memory_ut.o 00:02:56.934 CC app/fio/nvme/fio_plugin.o 00:02:56.934 CC test/app/bdev_svc/bdev_svc.o 00:02:57.201 LINK rpc_client_test 00:02:57.201 CC test/env/mem_callbacks/mem_callbacks.o 00:02:57.201 LINK spdk_nvme_discover 00:02:57.201 LINK interrupt_tgt 00:02:57.201 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:57.201 LINK spdk_trace_record 00:02:57.201 CC test/dma/test_dma/test_dma.o 00:02:57.201 LINK nvmf_tgt 00:02:57.201 CXX test/cpp_headers/likely.o 00:02:57.201 CXX test/cpp_headers/log.o 00:02:57.201 LINK iscsi_tgt 00:02:57.201 CXX test/cpp_headers/lvol.o 00:02:57.201 LINK jsoncat 00:02:57.201 CXX test/cpp_headers/memory.o 00:02:57.201 LINK histogram_perf 00:02:57.201 LINK poller_perf 00:02:57.201 CXX test/cpp_headers/mmio.o 00:02:57.201 CXX test/cpp_headers/nbd.o 00:02:57.201 CXX test/cpp_headers/notify.o 00:02:57.201 CXX test/cpp_headers/nvme.o 00:02:57.201 CXX test/cpp_headers/nvme_intel.o 00:02:57.468 CXX test/cpp_headers/nvme_ocssd.o 00:02:57.468 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:57.468 CXX 
test/cpp_headers/nvme_spec.o 00:02:57.468 CXX test/cpp_headers/nvme_zns.o 00:02:57.468 CXX test/cpp_headers/nvmf_cmd.o 00:02:57.468 LINK stub 00:02:57.468 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:57.468 CXX test/cpp_headers/nvmf.o 00:02:57.468 CXX test/cpp_headers/nvmf_spec.o 00:02:57.468 CXX test/cpp_headers/nvmf_transport.o 00:02:57.468 CXX test/cpp_headers/opal.o 00:02:57.468 LINK vtophys 00:02:57.468 CXX test/cpp_headers/opal_spec.o 00:02:57.468 LINK verify 00:02:57.468 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:57.468 LINK ioat_perf 00:02:57.468 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:57.468 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:57.468 CXX test/cpp_headers/pci_ids.o 00:02:57.468 CXX test/cpp_headers/pipe.o 00:02:57.468 CXX test/cpp_headers/queue.o 00:02:57.468 CXX test/cpp_headers/reduce.o 00:02:57.468 CXX test/cpp_headers/scheduler.o 00:02:57.468 CXX test/cpp_headers/rpc.o 00:02:57.468 CXX test/cpp_headers/scsi.o 00:02:57.468 CXX test/cpp_headers/scsi_spec.o 00:02:57.468 CXX test/cpp_headers/sock.o 00:02:57.468 CXX test/cpp_headers/stdinc.o 00:02:57.468 CXX test/cpp_headers/string.o 00:02:57.468 CXX test/cpp_headers/thread.o 00:02:57.468 CXX test/cpp_headers/trace.o 00:02:57.468 CXX test/cpp_headers/trace_parser.o 00:02:57.468 LINK spdk_trace 00:02:57.468 CXX test/cpp_headers/tree.o 00:02:57.468 CXX test/cpp_headers/ublk.o 00:02:57.468 CXX test/cpp_headers/util.o 00:02:57.468 CXX test/cpp_headers/uuid.o 00:02:57.468 CXX test/cpp_headers/version.o 00:02:57.468 CXX test/cpp_headers/vfio_user_pci.o 00:02:57.468 CXX test/cpp_headers/vfio_user_spec.o 00:02:57.468 CXX test/cpp_headers/vhost.o 00:02:57.468 CXX test/cpp_headers/vmd.o 00:02:57.727 CXX test/cpp_headers/xor.o 00:02:57.727 CXX test/cpp_headers/zipf.o 00:02:57.727 LINK spdk_tgt 00:02:57.727 LINK pci_ut 00:02:57.727 LINK bdev_svc 00:02:57.727 LINK env_dpdk_post_init 00:02:57.727 LINK zipf 00:02:57.727 LINK spdk_dd 00:02:57.727 LINK spdk_nvme 00:02:57.727 LINK nvme_fuzz 00:02:57.985 CC test/event/reactor_perf/reactor_perf.o 00:02:57.985 CC test/event/event_perf/event_perf.o 00:02:57.985 CC test/event/reactor/reactor.o 00:02:57.985 CC test/event/app_repeat/app_repeat.o 00:02:57.985 CC test/event/scheduler/scheduler.o 00:02:57.985 CC app/vhost/vhost.o 00:02:57.985 LINK spdk_bdev 00:02:57.985 LINK mem_callbacks 00:02:57.985 LINK test_dma 00:02:57.985 LINK reactor 00:02:58.244 LINK reactor_perf 00:02:58.244 LINK spdk_top 00:02:58.244 LINK event_perf 00:02:58.244 LINK spdk_nvme_identify 00:02:58.244 LINK app_repeat 00:02:58.244 LINK vhost_fuzz 00:02:58.244 LINK spdk_nvme_perf 00:02:58.244 LINK vhost 00:02:58.244 LINK scheduler 00:02:58.244 CC examples/vmd/lsvmd/lsvmd.o 00:02:58.244 CC examples/vmd/led/led.o 00:02:58.244 CC examples/sock/hello_world/hello_sock.o 00:02:58.244 CC examples/thread/thread/thread_ex.o 00:02:58.244 CC examples/idxd/perf/perf.o 00:02:58.503 LINK lsvmd 00:02:58.503 LINK led 00:02:58.503 LINK memory_ut 00:02:58.503 LINK hello_sock 00:02:58.761 LINK thread 00:02:58.761 CC test/nvme/e2edp/nvme_dp.o 00:02:58.761 CC test/nvme/connect_stress/connect_stress.o 00:02:58.761 CC test/nvme/overhead/overhead.o 00:02:58.761 CC test/nvme/err_injection/err_injection.o 00:02:58.761 CC test/nvme/startup/startup.o 00:02:58.761 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:58.761 CC test/nvme/reserve/reserve.o 00:02:58.761 CC test/nvme/cuse/cuse.o 00:02:58.761 CC test/nvme/fused_ordering/fused_ordering.o 00:02:58.761 LINK idxd_perf 00:02:58.761 CC test/nvme/sgl/sgl.o 00:02:58.761 CC 
test/nvme/simple_copy/simple_copy.o 00:02:58.761 CC test/nvme/aer/aer.o 00:02:58.761 CC test/nvme/compliance/nvme_compliance.o 00:02:58.761 CC test/nvme/reset/reset.o 00:02:58.761 CC test/nvme/boot_partition/boot_partition.o 00:02:58.761 CC test/nvme/fdp/fdp.o 00:02:58.761 CC test/blobfs/mkfs/mkfs.o 00:02:58.761 CC test/accel/dif/dif.o 00:02:58.761 CC test/lvol/esnap/esnap.o 00:02:58.761 LINK startup 00:02:59.019 LINK doorbell_aers 00:02:59.019 LINK connect_stress 00:02:59.019 LINK err_injection 00:02:59.019 LINK boot_partition 00:02:59.019 LINK reserve 00:02:59.019 LINK simple_copy 00:02:59.019 LINK mkfs 00:02:59.019 LINK nvme_dp 00:02:59.019 LINK reset 00:02:59.019 LINK sgl 00:02:59.019 LINK overhead 00:02:59.019 LINK nvme_compliance 00:02:59.019 CC examples/nvme/reconnect/reconnect.o 00:02:59.019 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:59.019 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:59.019 CC examples/nvme/hello_world/hello_world.o 00:02:59.019 CC examples/nvme/abort/abort.o 00:02:59.019 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:59.019 LINK fdp 00:02:59.019 CC examples/nvme/hotplug/hotplug.o 00:02:59.019 CC examples/nvme/arbitration/arbitration.o 00:02:59.277 LINK fused_ordering 00:02:59.277 CC examples/accel/perf/accel_perf.o 00:02:59.277 CC examples/blob/hello_world/hello_blob.o 00:02:59.277 CC examples/blob/cli/blobcli.o 00:02:59.277 LINK cmb_copy 00:02:59.277 LINK pmr_persistence 00:02:59.277 LINK hello_world 00:02:59.277 LINK hotplug 00:02:59.277 LINK aer 00:02:59.535 LINK dif 00:02:59.535 LINK iscsi_fuzz 00:02:59.535 LINK arbitration 00:02:59.535 LINK reconnect 00:02:59.535 LINK abort 00:02:59.535 LINK hello_blob 00:02:59.535 LINK nvme_manage 00:02:59.793 LINK accel_perf 00:02:59.793 LINK blobcli 00:03:00.052 LINK cuse 00:03:00.052 CC test/bdev/bdevio/bdevio.o 00:03:00.309 CC examples/bdev/hello_world/hello_bdev.o 00:03:00.309 CC examples/bdev/bdevperf/bdevperf.o 00:03:00.568 LINK bdevio 00:03:00.568 LINK hello_bdev 00:03:01.134 LINK bdevperf 00:03:02.069 CC examples/nvmf/nvmf/nvmf.o 00:03:02.069 LINK nvmf 00:03:03.971 LINK esnap 00:03:04.230 00:03:04.230 real 1m35.695s 00:03:04.230 user 18m9.632s 00:03:04.230 sys 4m20.951s 00:03:04.230 13:28:52 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:03:04.230 13:28:52 make -- common/autotest_common.sh@10 -- $ set +x 00:03:04.230 ************************************ 00:03:04.230 END TEST make 00:03:04.230 ************************************ 00:03:04.488 13:28:52 -- common/autotest_common.sh@1142 -- $ return 0 00:03:04.488 13:28:52 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:04.488 13:28:52 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:04.488 13:28:52 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:04.488 13:28:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:04.488 13:28:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:04.488 13:28:52 -- pm/common@44 -- $ pid=273463 00:03:04.488 13:28:52 -- pm/common@50 -- $ kill -TERM 273463 00:03:04.488 13:28:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:04.488 13:28:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:04.488 13:28:52 -- pm/common@44 -- $ pid=273465 00:03:04.488 13:28:52 -- pm/common@50 -- $ kill -TERM 273465 00:03:04.488 13:28:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:04.488 13:28:52 -- 
pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:04.488 13:28:52 -- pm/common@44 -- $ pid=273467 00:03:04.488 13:28:52 -- pm/common@50 -- $ kill -TERM 273467 00:03:04.488 13:28:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:04.488 13:28:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:04.488 13:28:52 -- pm/common@44 -- $ pid=273493 00:03:04.488 13:28:52 -- pm/common@50 -- $ sudo -E kill -TERM 273493 00:03:04.488 13:28:52 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:03:04.488 13:28:52 -- nvmf/common.sh@7 -- # uname -s 00:03:04.488 13:28:52 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:04.488 13:28:52 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:04.488 13:28:52 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:04.488 13:28:52 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:04.488 13:28:52 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:04.488 13:28:52 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:04.488 13:28:52 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:04.489 13:28:52 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:04.489 13:28:52 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:04.489 13:28:52 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:04.489 13:28:52 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:03:04.489 13:28:52 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:03:04.489 13:28:53 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:04.489 13:28:53 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:04.489 13:28:53 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:04.489 13:28:53 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:04.489 13:28:53 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:03:04.489 13:28:53 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:04.489 13:28:53 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:04.489 13:28:53 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:04.489 13:28:53 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:04.489 13:28:53 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:04.489 13:28:53 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:04.489 13:28:53 -- paths/export.sh@5 -- # export PATH 00:03:04.489 13:28:53 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:04.489 13:28:53 -- nvmf/common.sh@47 -- # : 0 00:03:04.489 13:28:53 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:03:04.489 13:28:53 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:03:04.489 13:28:53 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:04.489 13:28:53 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:04.489 13:28:53 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:04.489 13:28:53 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:03:04.489 13:28:53 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:03:04.489 13:28:53 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:03:04.489 13:28:53 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:04.489 13:28:53 -- spdk/autotest.sh@32 -- # uname -s 00:03:04.489 13:28:53 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:04.489 13:28:53 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:04.489 13:28:53 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:04.489 13:28:53 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:04.489 13:28:53 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:04.489 13:28:53 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:04.489 13:28:53 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:04.489 13:28:53 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:04.489 13:28:53 -- spdk/autotest.sh@48 -- # udevadm_pid=340982 00:03:04.489 13:28:53 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:04.489 13:28:53 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:04.489 13:28:53 -- pm/common@17 -- # local monitor 00:03:04.489 13:28:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:04.489 13:28:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:04.489 13:28:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:04.489 13:28:53 -- pm/common@21 -- # date +%s 00:03:04.489 13:28:53 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:04.489 13:28:53 -- pm/common@21 -- # date +%s 00:03:04.489 13:28:53 -- pm/common@25 -- # sleep 1 00:03:04.489 13:28:53 -- pm/common@21 -- # date +%s 00:03:04.489 13:28:53 -- pm/common@21 -- # date +%s 00:03:04.489 13:28:53 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720783733 00:03:04.489 13:28:53 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720783733 00:03:04.489 13:28:53 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720783733 00:03:04.489 13:28:53 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720783733 00:03:04.747 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720783733_collect-vmstat.pm.log 00:03:04.747 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720783733_collect-cpu-load.pm.log 00:03:04.747 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720783733_collect-cpu-temp.pm.log 00:03:04.747 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720783733_collect-bmc-pm.bmc.pm.log 00:03:05.681 13:28:54 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:05.681 13:28:54 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:05.681 13:28:54 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:05.681 13:28:54 -- common/autotest_common.sh@10 -- # set +x 00:03:05.681 13:28:54 -- spdk/autotest.sh@59 -- # create_test_list 00:03:05.681 13:28:54 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:05.681 13:28:54 -- common/autotest_common.sh@10 -- # set +x 00:03:05.681 13:28:54 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:03:05.681 13:28:54 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:05.681 13:28:54 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:05.681 13:28:54 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:03:05.681 13:28:54 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:05.681 13:28:54 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:05.681 13:28:54 -- common/autotest_common.sh@1455 -- # uname 00:03:05.681 13:28:54 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:03:05.681 13:28:54 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:05.681 13:28:54 -- common/autotest_common.sh@1475 -- # uname 00:03:05.681 13:28:54 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:03:05.681 13:28:54 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:03:05.681 13:28:54 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:03:05.681 13:28:54 -- spdk/autotest.sh@72 -- # hash lcov 00:03:05.681 13:28:54 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:03:05.681 13:28:54 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:03:05.681 --rc lcov_branch_coverage=1 00:03:05.681 --rc lcov_function_coverage=1 00:03:05.681 --rc genhtml_branch_coverage=1 00:03:05.681 --rc genhtml_function_coverage=1 00:03:05.681 --rc genhtml_legend=1 00:03:05.681 --rc geninfo_all_blocks=1 00:03:05.681 ' 00:03:05.681 13:28:54 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:03:05.681 --rc lcov_branch_coverage=1 00:03:05.681 --rc lcov_function_coverage=1 00:03:05.681 --rc genhtml_branch_coverage=1 00:03:05.681 --rc genhtml_function_coverage=1 00:03:05.681 --rc genhtml_legend=1 00:03:05.681 --rc geninfo_all_blocks=1 00:03:05.681 ' 00:03:05.681 13:28:54 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:03:05.681 --rc lcov_branch_coverage=1 00:03:05.681 --rc lcov_function_coverage=1 00:03:05.681 --rc genhtml_branch_coverage=1 00:03:05.681 --rc genhtml_function_coverage=1 00:03:05.681 --rc genhtml_legend=1 00:03:05.681 --rc geninfo_all_blocks=1 00:03:05.681 --no-external' 00:03:05.681 13:28:54 -- spdk/autotest.sh@81 -- # 
LCOV='lcov 00:03:05.681 --rc lcov_branch_coverage=1 00:03:05.681 --rc lcov_function_coverage=1 00:03:05.681 --rc genhtml_branch_coverage=1 00:03:05.681 --rc genhtml_function_coverage=1 00:03:05.681 --rc genhtml_legend=1 00:03:05.681 --rc geninfo_all_blocks=1 00:03:05.681 --no-external' 00:03:05.681 13:28:54 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:03:05.681 lcov: LCOV version 1.14 00:03:05.681 13:28:54 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:23.759 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:23.759 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:35.954 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:35.954 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:35.955 
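The lcov invocation above (spdk/autotest.sh@85) captures an initial "Baseline" snapshot of every instrumented file before any test has run, so that sources the tests never touch still show up in the final report; the long runs of "no functions found" warnings around this point are expected, since the test/cpp_headers objects are compile-only header checks with no executable code. A minimal sketch of that baseline-then-merge pattern, using plain lcov with hypothetical SRC/OUT paths rather than the autotest script itself:

#!/usr/bin/env bash
# Sketch of a baseline lcov workflow (assumed layout, not the SPDK autotest code):
# capture zero coverage first, run the tests, then merge both captures so that
# untested files still appear in the report.
set -euo pipefail

SRC=$PWD            # source tree built with --coverage (assumption)
OUT=$PWD/coverage   # output directory for tracefiles (hypothetical)
mkdir -p "$OUT"

# A subset of the LCOV_OPTS exported above; left unquoted on purpose so the
# individual options are word-split when passed to lcov.
LCOV_OPTS="--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --no-external"

# 1. Initial capture: all counters at zero, tagged "Baseline".
lcov $LCOV_OPTS -q -c -i -t Baseline -d "$SRC" -o "$OUT/cov_base.info"

# 2. ... run the test suite here ...

# 3. Capture the counters the tests actually produced.
lcov $LCOV_OPTS -q -c -t Tests -d "$SRC" -o "$OUT/cov_test.info"

# 4. Merge baseline and test data into a single tracefile for genhtml.
lcov $LCOV_OPTS -q -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" -o "$OUT/cov_total.info"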
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:35.955 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:35.955 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:35.955 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:35.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:35.955 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:35.956 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:35.956 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:41.223 13:29:28 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:41.223 13:29:28 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:41.223 13:29:28 -- common/autotest_common.sh@10 -- # set +x 00:03:41.223 13:29:28 -- spdk/autotest.sh@91 -- # rm -f 00:03:41.223 13:29:28 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:43.755 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:43.755 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:43.755 0000:5e:00.0 (8086 0b60): Already using the nvme driver 00:03:43.755 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:43.755 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:43.755 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:43.755 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:43.755 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:43.755 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:43.755 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:43.755 0000:00:04.0 (8086 2021): Already using the ioatdma 
driver 00:03:43.755 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:43.755 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:43.755 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:43.755 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:43.755 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:44.013 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:44.013 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:44.013 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:44.013 13:29:32 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:44.013 13:29:32 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:44.013 13:29:32 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:44.013 13:29:32 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:44.013 13:29:32 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:44.013 13:29:32 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:44.013 13:29:32 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:44.013 13:29:32 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:44.013 13:29:32 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:44.013 13:29:32 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:44.013 13:29:32 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:44.013 13:29:32 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:44.013 13:29:32 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:44.013 13:29:32 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:44.013 13:29:32 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:44.013 No valid GPT data, bailing 00:03:44.013 13:29:32 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:44.013 13:29:32 -- scripts/common.sh@391 -- # pt= 00:03:44.013 13:29:32 -- scripts/common.sh@392 -- # return 1 00:03:44.013 13:29:32 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:44.013 1+0 records in 00:03:44.013 1+0 records out 00:03:44.013 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00509561 s, 206 MB/s 00:03:44.013 13:29:32 -- spdk/autotest.sh@118 -- # sync 00:03:44.013 13:29:32 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:44.013 13:29:32 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:44.013 13:29:32 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:49.294 13:29:37 -- spdk/autotest.sh@124 -- # uname -s 00:03:49.294 13:29:37 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:49.294 13:29:37 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:49.294 13:29:37 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:49.294 13:29:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:49.294 13:29:37 -- common/autotest_common.sh@10 -- # set +x 00:03:49.294 ************************************ 00:03:49.294 START TEST setup.sh 00:03:49.294 ************************************ 00:03:49.294 13:29:37 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:49.294 * Looking for test storage... 
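Before wiping the namespace, the script first runs get_zoned_devs and block_in_use (logged above at autotest.sh@96-114): zoned namespaces are excluded, and a device that still carries a partition table is refused, which is why spdk-gpt.py reports "No valid GPT data, bailing" and blkid returns an empty PTTYPE before the 1 MiB dd wipe. A condensed sketch of that safety check, assuming a single conventional namespace as in this run and using plain blkid in place of SPDK's scripts/spdk-gpt.py probe:

#!/usr/bin/env bash
# Condensed sketch of the pre-test disk check (assumptions: one non-zoned
# namespace; blkid stands in for SPDK's spdk-gpt.py GPT probe). Needs root.
set -euo pipefail

dev=/dev/nvme0n1                              # device name taken from the log above

# 1. Skip zoned namespaces: the kernel reports the zoning model in sysfs, and
#    anything other than "none" means the block device is zoned.
zoned=$(cat "/sys/block/$(basename "$dev")/queue/zoned" 2>/dev/null || echo none)
if [[ "$zoned" != none ]]; then
    echo "skipping zoned device $dev"
    exit 0
fi

# 2. Refuse to scrub a disk that still has a recognizable partition table.
if pt=$(blkid -s PTTYPE -o value "$dev") && [[ -n "$pt" ]]; then
    echo "$dev carries a $pt partition table, leaving it alone"
    exit 1
fi

# 3. Otherwise zero the first MiB so stale metadata cannot confuse the tests.
dd if=/dev/zero of="$dev" bs=1M count=1
sync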
00:03:49.294 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:49.294 13:29:37 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:49.294 13:29:37 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:49.294 13:29:37 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:49.294 13:29:37 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:49.294 13:29:37 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:49.294 13:29:37 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:49.294 ************************************ 00:03:49.294 START TEST acl 00:03:49.294 ************************************ 00:03:49.294 13:29:37 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:49.294 * Looking for test storage... 00:03:49.294 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:49.294 13:29:37 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:49.294 13:29:37 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:49.294 13:29:37 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:49.294 13:29:37 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:49.294 13:29:37 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:49.294 13:29:37 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:49.294 13:29:37 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:49.294 13:29:37 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:49.294 13:29:37 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:49.294 13:29:37 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:49.294 13:29:37 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:49.294 13:29:37 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:49.294 13:29:37 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:49.294 13:29:37 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:49.294 13:29:37 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:49.294 13:29:37 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:53.484 13:29:41 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:53.484 13:29:41 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:53.484 13:29:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:53.484 13:29:41 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:53.484 13:29:41 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:53.484 13:29:41 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:56.013 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:56.013 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.014 Hugepages 00:03:56.014 node hugesize free / total 
00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.014 00:03:56.014 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.014 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.272 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:56.272 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.272 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.272 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.272 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:56.272 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.272 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # 
continue 00:03:56.272 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ - == nvme ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # [[ - == nvme ]] 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:56.273 13:29:44 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:56.273 13:29:44 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:56.273 13:29:44 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:56.273 13:29:44 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:56.273 ************************************ 00:03:56.273 START TEST denied 00:03:56.273 ************************************ 00:03:56.273 13:29:44 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:03:56.273 13:29:44 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:03:56.273 13:29:44 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:56.273 13:29:44 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:03:56.273 13:29:44 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:56.273 13:29:44 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:00.509 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:04:00.509 13:29:48 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:04:00.509 13:29:48 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:00.509 13:29:48 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:00.509 13:29:48 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:04:00.509 13:29:48 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:04:00.509 13:29:48 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:00.509 13:29:48 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:00.509 13:29:48 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:00.509 13:29:48 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:00.509 13:29:48 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:04.704 00:04:04.704 real 0m8.109s 00:04:04.704 user 0m2.615s 00:04:04.704 sys 0m4.778s 00:04:04.704 13:29:52 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:04.704 13:29:52 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:04.704 ************************************ 00:04:04.704 END TEST denied 00:04:04.704 ************************************ 00:04:04.704 13:29:52 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:04.704 13:29:52 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:04.704 13:29:52 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:04.704 13:29:52 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:04.704 13:29:52 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:04.704 ************************************ 00:04:04.704 START TEST allowed 00:04:04.704 ************************************ 00:04:04.704 13:29:53 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- 
# allowed 00:04:04.704 13:29:53 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:04:04.704 13:29:53 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:04.704 13:29:53 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:04:04.704 13:29:53 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:04.704 13:29:53 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:11.261 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:11.261 13:29:58 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:11.261 13:29:58 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:11.261 13:29:58 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:11.261 13:29:58 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:11.261 13:29:58 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:14.544 00:04:14.544 real 0m9.446s 00:04:14.544 user 0m2.499s 00:04:14.544 sys 0m4.683s 00:04:14.544 13:30:02 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:14.544 13:30:02 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:14.544 ************************************ 00:04:14.544 END TEST allowed 00:04:14.544 ************************************ 00:04:14.544 13:30:02 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:14.544 00:04:14.544 real 0m24.924s 00:04:14.544 user 0m7.849s 00:04:14.544 sys 0m14.393s 00:04:14.544 13:30:02 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:14.544 13:30:02 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:14.544 ************************************ 00:04:14.544 END TEST acl 00:04:14.544 ************************************ 00:04:14.544 13:30:02 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:14.544 13:30:02 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:14.544 13:30:02 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:14.544 13:30:02 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:14.544 13:30:02 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:14.544 ************************************ 00:04:14.544 START TEST hugepages 00:04:14.544 ************************************ 00:04:14.544 13:30:02 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:14.544 * Looking for test storage... 
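(Editor's sketch, not part of the console output.) The denied/allowed checks traced above reduce to resolving a PCI device's bound driver through sysfs and comparing it with the expected one; setup.sh config then either skips the controller when it is in PCI_BLOCKED ("Skipping denied controller at 0000:5e:00.0") or rebinds it when it is in PCI_ALLOWED ("nvme -> vfio-pci"). A minimal bash sketch of that sysfs check, using the 0000:5e:00.0 address seen in the log; the helper name verify_driver is hypothetical, not the script's own:

# Illustrative only -- mirrors the readlink/basename/compare steps in the trace above.
verify_driver() {
    local bdf=$1 expected=$2 link driver
    [[ -e /sys/bus/pci/devices/$bdf ]] || return 1
    link=$(readlink -f "/sys/bus/pci/devices/$bdf/driver")   # e.g. /sys/bus/pci/drivers/nvme
    driver=${link##*/}                                       # keep only the driver name
    [[ $driver == "$expected" ]]
}
verify_driver 0000:5e:00.0 nvme && echo "0000:5e:00.0 is bound to nvme"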
00:04:14.544 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 76873704 kB' 'MemAvailable: 80128392 kB' 'Buffers: 11136 kB' 'Cached: 9188132 kB' 'SwapCached: 0 kB' 'Active: 6448920 kB' 'Inactive: 3437340 kB' 'Active(anon): 6065632 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 690448 kB' 'Mapped: 149356 kB' 'Shmem: 5378640 kB' 'KReclaimable: 175404 kB' 'Slab: 434288 kB' 'SReclaimable: 175404 kB' 'SUnreclaim: 258884 kB' 'KernelStack: 16160 kB' 'PageTables: 8684 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438188 kB' 'Committed_AS: 7645228 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198548 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages 
-- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 
-- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.544 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.545 13:30:02 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:14.545 13:30:02 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:14.545 13:30:02 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:14.545 13:30:02 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:14.545 13:30:02 
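(Editor's sketch, not part of the console output.) The get_meminfo Hugepagesize walk traced above is setup/common.sh splitting each /proc/meminfo line on IFS=': ' and comparing the key field until it reaches Hugepagesize, then echoing 2048. The same idea in a few lines of bash, as an illustration rather than the project's code verbatim; the helper name get_meminfo_value is hypothetical:

# Illustrative only -- scan /proc/meminfo, split "Key:   value unit" on ': ',
# and print the value for the requested key (2048 for Hugepagesize on this node).
get_meminfo_value() {
    local key=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$key" ]] && { echo "$val"; return 0; }
    done < /proc/meminfo
    return 1
}
default_hugepages=$(get_meminfo_value Hugepagesize)   # -> 2048 (kB) in the trace above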
setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:14.545 ************************************ 00:04:14.545 START TEST default_setup 00:04:14.545 ************************************ 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:14.545 13:30:02 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:17.835 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:17.835 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:17.835 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 
0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:17.835 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:20.374 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79034460 kB' 'MemAvailable: 82288808 kB' 'Buffers: 11136 kB' 'Cached: 9188240 kB' 'SwapCached: 0 kB' 'Active: 6461340 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078052 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 702184 kB' 'Mapped: 148768 kB' 'Shmem: 5378748 kB' 'KReclaimable: 174724 kB' 'Slab: 432252 kB' 'SReclaimable: 174724 kB' 'SUnreclaim: 257528 kB' 'KernelStack: 16192 kB' 'PageTables: 8152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7657880 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198704 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 
'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.374 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.375 13:30:08 
setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79041644 kB' 'MemAvailable: 82295984 kB' 'Buffers: 11136 kB' 'Cached: 9188244 kB' 'SwapCached: 0 kB' 'Active: 6462180 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078892 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 703472 kB' 'Mapped: 149200 kB' 'Shmem: 5378752 kB' 'KReclaimable: 174708 kB' 'Slab: 432204 kB' 'SReclaimable: 174708 kB' 'SUnreclaim: 257496 kB' 'KernelStack: 16192 kB' 'PageTables: 7936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7660312 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198672 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var 
val _ 00:04:20.376 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 
13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79040016 kB' 'MemAvailable: 82294356 kB' 'Buffers: 11136 kB' 'Cached: 9188260 kB' 'SwapCached: 0 kB' 'Active: 6465552 kB' 'Inactive: 
3437340 kB' 'Active(anon): 6082264 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 706856 kB' 'Mapped: 149520 kB' 'Shmem: 5378768 kB' 'KReclaimable: 174708 kB' 'Slab: 432236 kB' 'SReclaimable: 174708 kB' 'SUnreclaim: 257528 kB' 'KernelStack: 16256 kB' 'PageTables: 8840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7662552 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198644 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.377 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 
13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.378 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 
-- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:20.379 nr_hugepages=1024 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:20.379 resv_hugepages=0 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:20.379 surplus_hugepages=0 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:20.379 anon_hugepages=0 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:20.379 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79048920 kB' 'MemAvailable: 82303260 kB' 'Buffers: 11136 kB' 'Cached: 9188284 kB' 'SwapCached: 0 kB' 'Active: 6460392 kB' 'Inactive: 3437340 kB' 'Active(anon): 6077104 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701676 kB' 'Mapped: 148612 kB' 'Shmem: 5378792 kB' 'KReclaimable: 174708 kB' 'Slab: 432236 kB' 'SReclaimable: 174708 kB' 
'SUnreclaim: 257528 kB' 'KernelStack: 16224 kB' 'PageTables: 8528 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7657944 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198736 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.380 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:20.381 13:30:08 
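The trace above is the meminfo key scan: every field of /proc/meminfo is read with IFS=': ' and skipped with `continue` until the requested key (here HugePages_Total) matches, at which point its value is echoed and the function returns 0. A minimal sketch of that pattern, assuming only a standard /proc/meminfo layout; the helper name below is illustrative, not a function from the repository:

    get_meminfo_value() {                      # illustrative name, not an SPDK helper
        local want=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$want" ]] || continue  # skip every non-matching field, as in the trace
            echo "$val"
            return 0
        done < /proc/meminfo
        return 1
    }
    get_meminfo_value HugePages_Total          # e.g. prints 1024 on this node
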
setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.381 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36589708 kB' 'MemUsed: 11527232 kB' 'SwapCached: 0 kB' 'Active: 5375048 kB' 'Inactive: 3368240 kB' 'Active(anon): 5223372 kB' 'Inactive(anon): 0 kB' 'Active(file): 151676 kB' 'Inactive(file): 3368240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8301272 kB' 'Mapped: 75080 kB' 'AnonPages: 445220 kB' 'Shmem: 4781356 kB' 'KernelStack: 8936 kB' 'PageTables: 4776 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105284 kB' 'Slab: 271412 kB' 'SReclaimable: 105284 kB' 'SUnreclaim: 166128 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 
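The node-scoped lookup that starts above works the same way, except that /sys/devices/system/node/node0/meminfo prefixes every line with "Node 0 ", so the file is slurped into an array and the prefix stripped with an extglob pattern before the same key scan runs. A rough sketch, assuming a NUMA system that exposes node0 (the sysfs path is the standard one; the awk pipeline at the end is only illustrative):

    shopt -s extglob                                   # needed for the +([0-9]) pattern below
    node=0
    mem_f=/sys/devices/system/node/node${node}/meminfo
    mapfile -t mem < "$mem_f"                          # one "Node 0 Key: value" line per element
    mem=("${mem[@]#Node +([0-9]) }")                   # drop the "Node 0 " prefix from every entry
    printf '%s\n' "${mem[@]}" | awk '$1 == "HugePages_Surp:" {print $2}'   # e.g. 0 on this node
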
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.382 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.383 
13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:20.383 node0=1024 expecting 1024 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:20.383 00:04:20.383 real 0m5.938s 00:04:20.383 user 0m1.375s 00:04:20.383 sys 0m2.381s 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:20.383 13:30:08 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:20.383 ************************************ 00:04:20.383 END TEST default_setup 00:04:20.383 ************************************ 00:04:20.383 13:30:08 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:20.383 13:30:08 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:20.383 13:30:08 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:20.383 13:30:08 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:20.383 13:30:08 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:20.383 ************************************ 00:04:20.383 START TEST per_node_1G_alloc 00:04:20.383 ************************************ 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # 
get_test_nr_hugepages_per_node 0 1 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:20.383 13:30:08 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:23.693 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:23.693 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:23.693 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:23.693 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:23.693 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # 
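The per-node counts traced above come from straightforward arithmetic: a 1048576 kB (1 GiB) request divided by the default 2048 kB hugepage size gives 512 pages, and with HUGENODE=0,1 each of the two nodes is asked for 512 before setup.sh is invoked. A back-of-the-envelope sketch of that calculation (variable names are illustrative; the invocation in the trailing comment is approximate):

    size_kb=1048576                               # size passed to get_test_nr_hugepages
    hugepage_kb=2048                              # default hugepage size on this system
    nodes=(0 1)                                   # HUGENODE=0,1
    nr_hugepages=$(( size_kb / hugepage_kb ))     # 512
    for n in "${nodes[@]}"; do
        echo "node${n}: ${nr_hugepages} hugepages requested"
    done
    # The allocation itself is then driven by something close to:
    #   NRHUGE=512 HUGENODE=0,1 ./scripts/setup.sh
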
verify_nr_hugepages 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79063700 kB' 'MemAvailable: 82318020 kB' 'Buffers: 11136 kB' 'Cached: 9188372 kB' 'SwapCached: 0 kB' 'Active: 6462008 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078720 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 703036 kB' 'Mapped: 148652 kB' 'Shmem: 5378880 kB' 'KReclaimable: 174668 kB' 'Slab: 433064 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258396 kB' 'KernelStack: 16160 kB' 'PageTables: 8460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7655800 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198704 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.693 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:23.694 13:30:12 
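The anon check traced above ([[ always [madvise] never != *\[\n\e\v\e\r\]* ]] followed by get_meminfo AnonHugePages) only reads AnonHugePages when transparent hugepages are not set to "never". A small sketch of that gate, assuming the usual sysfs location for the THP setting:

    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    if [[ $thp != *"[never]"* ]]; then
        # THP is enabled (or madvise-only), so anonymous hugepage usage is meaningful
        awk '$1 == "AnonHugePages:" {print $2, $3}' /proc/meminfo   # 0 kB in the run above
    fi
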
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79064404 kB' 'MemAvailable: 82318724 kB' 'Buffers: 11136 kB' 'Cached: 9188392 kB' 'SwapCached: 0 kB' 'Active: 6461352 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078064 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 702380 kB' 'Mapped: 148616 kB' 'Shmem: 5378900 kB' 'KReclaimable: 174668 kB' 'Slab: 433112 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258444 kB' 'KernelStack: 16128 kB' 'PageTables: 8368 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7655820 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198672 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.694 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.695 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.695 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.695 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _
… setup/common.sh@31-32 repeat the IFS=': ' / read -r var val _ / continue cycle for every remaining /proc/meminfo field (Buffers through HugePages_Rsvd); none of them matches HugePages_Surp …
00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local
get=HugePages_Rsvd 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79065216 kB' 'MemAvailable: 82319536 kB' 'Buffers: 11136 kB' 'Cached: 9188392 kB' 'SwapCached: 0 kB' 'Active: 6461444 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078156 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 702448 kB' 'Mapped: 148616 kB' 'Shmem: 5378900 kB' 'KReclaimable: 174668 kB' 'Slab: 433112 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258444 kB' 'KernelStack: 16144 kB' 'PageTables: 8420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7655840 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198672 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.696 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.696 13:30:12 
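For reference, the pattern traced in the preceding entries is setup/common.sh's get_meminfo helper resolving HugePages_Rsvd: with no node argument it reads /proc/meminfo (a per-node query would use /sys/devices/system/node/node<N>/meminfo and strip the leading "Node N " prefix, as the mem=(...) expansion above shows), splits each line on ': ', skips every field that is not the requested one, then echoes the value and returns. A minimal sketch of that lookup, assuming only what the trace shows rather than the exact contents of common.sh:

get_meminfo_sketch() {
    # usage: get_meminfo_sketch <field>, e.g. get_meminfo_sketch HugePages_Rsvd
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skip every non-matching meminfo field
        echo "$val"                        # numeric value; any trailing "kB" unit lands in $_
        return 0
    done < /proc/meminfo
    return 1
}

Against the meminfo snapshot printed just above, get_meminfo_sketch HugePages_Rsvd would print 0 and get_meminfo_sketch HugePages_Total would print 1024.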
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
… setup/common.sh@31-32 repeat the IFS=': ' / read -r var val _ / continue cycle for every remaining /proc/meminfo field (Buffers through HugePages_Free); none of them matches HugePages_Rsvd …
00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:23.698 nr_hugepages=1024 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:23.698 resv_hugepages=0 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:23.698 surplus_hugepages=0
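The entries that follow are setup/hugepages.sh cross-checking those numbers before the per-node allocation test: the kernel-reported HugePages_Total has to equal the configured count plus any surplus and reserved pages, and get_nodes then walks /sys/devices/system/node/node* and records each node's share in nodes_sys (512 per node here, which with 1024 pages in total suggests an even split over two NUMA nodes). A rough sketch of that bookkeeping, with awk lookups standing in for the get_meminfo calls and the even split assumed rather than taken from the script:

nr_hugepages=1024
surp=$(awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo)
resv=$(awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo)
total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)

# verify the kernel-reported total matches the configured count plus surplus and reserved pages
(( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2

# spread the pool evenly across the NUMA nodes that sysfs reports
nodes=(/sys/devices/system/node/node[0-9]*)
[[ -d ${nodes[0]} ]] || exit 1
nodes_sys=()
for node in "${nodes[@]}"; do
    nodes_sys[${node##*node}]=$(( nr_hugepages / ${#nodes[@]} ))   # e.g. 1024 / 2 = 512
done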
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:23.698 anon_hugepages=0 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79065680 kB' 'MemAvailable: 82320000 kB' 'Buffers: 11136 kB' 'Cached: 9188436 kB' 'SwapCached: 0 kB' 'Active: 6461104 kB' 'Inactive: 3437340 kB' 'Active(anon): 6077816 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 702048 kB' 'Mapped: 148616 kB' 'Shmem: 5378944 kB' 'KReclaimable: 174668 kB' 'Slab: 433112 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258444 kB' 'KernelStack: 16128 kB' 'PageTables: 8364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7655864 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198672 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:23.698 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
… setup/common.sh@31-32 repeat the IFS=': ' / read -r var val _ / continue cycle for every remaining /proc/meminfo field (MemAvailable through ShmemHugePages); none of them matches HugePages_Total …
00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:23.960 13:30:12
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:23.960 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37635664 kB' 'MemUsed: 10481276 kB' 'SwapCached: 0 kB' 'Active: 5375816 kB' 'Inactive: 3368240 kB' 'Active(anon): 5224140 kB' 'Inactive(anon): 0 kB' 'Active(file): 151676 kB' 'Inactive(file): 3368240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8301380 kB' 'Mapped: 75076 kB' 'AnonPages: 445884 kB' 'Shmem: 4781464 kB' 'KernelStack: 8936 kB' 'PageTables: 4780 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105252 kB' 'Slab: 272176 kB' 'SReclaimable: 105252 kB' 'SUnreclaim: 166924 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.961 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41429792 kB' 'MemUsed: 2746740 kB' 'SwapCached: 0 kB' 'Active: 1085984 kB' 'Inactive: 69100 kB' 'Active(anon): 854372 kB' 'Inactive(anon): 0 kB' 'Active(file): 231612 kB' 'Inactive(file): 69100 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 898192 kB' 'Mapped: 73540 kB' 'AnonPages: 256896 kB' 'Shmem: 597480 kB' 'KernelStack: 7224 kB' 'PageTables: 3692 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 69416 kB' 'Slab: 160936 kB' 'SReclaimable: 69416 kB' 'SUnreclaim: 91520 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.962 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 
13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:23.963 node0=512 expecting 512 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:23.963 node1=512 expecting 512 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:23.963 00:04:23.963 real 0m3.559s 00:04:23.963 user 0m1.485s 00:04:23.963 sys 0m2.167s 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:23.963 13:30:12 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:23.963 ************************************ 00:04:23.963 END TEST per_node_1G_alloc 00:04:23.963 ************************************ 00:04:23.963 13:30:12 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:23.963 13:30:12 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:23.963 13:30:12 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:23.963 13:30:12 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:23.963 13:30:12 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set 
+x 00:04:23.963 ************************************ 00:04:23.963 START TEST even_2G_alloc 00:04:23.963 ************************************ 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:23.963 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:23.964 13:30:12 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:27.252 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:27.252 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:27.252 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:27.252 0000:00:04.6 (8086 2021): Already using 
the vfio-pci driver 00:04:27.252 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:27.252 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.252 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79086692 kB' 'MemAvailable: 82341012 kB' 'Buffers: 11136 kB' 'Cached: 9188528 kB' 'SwapCached: 0 kB' 'Active: 6459556 kB' 'Inactive: 3437340 kB' 'Active(anon): 6076268 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 
8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 700512 kB' 'Mapped: 147616 kB' 'Shmem: 5379036 kB' 'KReclaimable: 174668 kB' 'Slab: 432180 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 257512 kB' 'KernelStack: 16064 kB' 'PageTables: 8116 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7646904 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198624 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.253 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.254 13:30:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 
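Note on the trace above: the long run of key checks ending in "echo 0 / return 0" and "anon=0" is the test harness's get_meminfo helper from setup/common.sh scanning /proc/meminfo for one field (here AnonHugePages). Below is a minimal sketch reconstructed only from the traced statements (the local get/node/var/val declarations, mapfile -t mem, the "Node N " prefix strip, and the IFS=': ' read loop); the real helper in setup/common.sh may differ in details such as per-node file selection and error handling.

  #!/usr/bin/env bash
  # Sketch of get_meminfo as reconstructed from this trace; not the verbatim SPDK implementation.
  shopt -s extglob  # required for the +([0-9]) pattern visible in the trace

  get_meminfo() {
      local get=$1 node=${2:-}   # field name, optional NUMA node number
      local var val _
      local mem_f mem

      mem_f=/proc/meminfo
      # If a node was requested and a per-node meminfo exists, read that file instead.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi

      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node N " prefix used by per-node files

      local line
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue   # the repeated 'continue' lines in the trace are this test failing
          echo "$val"                        # prints the numeric value; the kB suffix lands in _
          return 0
      done
      return 1
  }

  # Usage matching the surrounding hugepages.sh trace:
  #   anon=$(get_meminfo AnonHugePages)   # -> 0
  #   surp=$(get_meminfo HugePages_Surp)  # -> 0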
00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79087956 kB' 'MemAvailable: 82342276 kB' 'Buffers: 11136 kB' 'Cached: 9188532 kB' 'SwapCached: 0 kB' 'Active: 6459304 kB' 'Inactive: 3437340 kB' 'Active(anon): 6076016 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 700252 kB' 'Mapped: 147584 kB' 'Shmem: 5379040 kB' 'KReclaimable: 174668 kB' 'Slab: 432208 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 257540 kB' 'KernelStack: 16048 kB' 'PageTables: 8068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7646920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198592 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.254 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.255 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:27.256 
13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79087704 kB' 'MemAvailable: 82342024 kB' 'Buffers: 11136 kB' 'Cached: 9188532 kB' 'SwapCached: 0 kB' 'Active: 6459336 kB' 'Inactive: 3437340 kB' 'Active(anon): 6076048 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 700280 kB' 'Mapped: 147584 kB' 'Shmem: 5379040 kB' 'KReclaimable: 174668 kB' 'Slab: 432208 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 257540 kB' 'KernelStack: 16064 kB' 'PageTables: 8120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7646940 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198608 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.256 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 
13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.257 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:27.258 nr_hugepages=1024 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:27.258 resv_hugepages=0 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:27.258 surplus_hugepages=0 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:27.258 anon_hugepages=0 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79087452 kB' 'MemAvailable: 82341772 kB' 'Buffers: 11136 kB' 'Cached: 9188572 kB' 'SwapCached: 0 kB' 'Active: 6459356 kB' 'Inactive: 3437340 kB' 'Active(anon): 6076068 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 700248 kB' 'Mapped: 147584 kB' 'Shmem: 5379080 kB' 'KReclaimable: 174668 kB' 'Slab: 432208 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 257540 kB' 'KernelStack: 16048 kB' 'PageTables: 8068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7646964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198608 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.258 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.259 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37628776 kB' 'MemUsed: 10488164 kB' 'SwapCached: 0 kB' 'Active: 5374488 kB' 'Inactive: 3368240 kB' 'Active(anon): 5222812 kB' 'Inactive(anon): 0 kB' 'Active(file): 151676 kB' 'Inactive(file): 3368240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8301440 kB' 'Mapped: 74856 kB' 'AnonPages: 444420 kB' 'Shmem: 4781524 kB' 'KernelStack: 8856 kB' 'PageTables: 4592 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 
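
With HugePages_Total confirmed as 1024 (the (( 1024 == nr_hugepages + surp + resv )) check at @110), the trace moves into get_nodes from setup/hugepages.sh, which globs the NUMA node directories and records how many 2 MB pages each node currently holds; both nodes report 512, so no_nodes=2. Roughly, and hedged, since the trace only shows the substituted values, the per-node sysfs attribute below is an assumption:

    # Enumerate the NUMA nodes and record each node's current 2048 kB hugepage count.
    shopt -s extglob nullglob
    declare -a nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
      # assumed attribute; the trace only shows the resulting values (512 and 512)
      nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}   # 2 on this machine
    (( no_nodes > 0 ))          # the test only proceeds on hosts that expose NUMA nodes
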
kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105252 kB' 'Slab: 271616 kB' 'SReclaimable: 105252 kB' 'SUnreclaim: 166364 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.260 13:30:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.260 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:27.520 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41459808 kB' 'MemUsed: 2716724 kB' 'SwapCached: 0 kB' 'Active: 1084920 kB' 'Inactive: 69100 kB' 'Active(anon): 853308 kB' 'Inactive(anon): 0 kB' 'Active(file): 231612 kB' 'Inactive(file): 69100 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 898312 kB' 'Mapped: 72728 kB' 'AnonPages: 255828 kB' 'Shmem: 597600 kB' 'KernelStack: 7192 kB' 'PageTables: 3476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 69416 kB' 'Slab: 160592 kB' 'SReclaimable: 69416 kB' 'SUnreclaim: 91176 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 
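
The two per-node lookups (HugePages_Surp for node 0 above and node 1 here) reuse get_meminfo with a node argument: at @22-@24 the source file switches from /proc/meminfo to /sys/devices/system/node/nodeN/meminfo, and at @29 the "Node N " prefix on every line is stripped so the same key/value parsing applies. A condensed sketch of that switch, keeping only the names visible in the trace and paraphrasing the surrounding glue:

    # Pick the meminfo source and normalize per-node lines so
    # "Node 0 HugePages_Surp: 0" parses the same as "HugePages_Surp: 0".
    shopt -s extglob
    node=${1-}                          # empty for system-wide, 0/1 for a NUMA node
    mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
      mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")    # extglob pattern taken verbatim from the trace

Both nodes report HugePages_Surp: 0, so the surplus adjustment at @117 adds nothing to the expected per-node counts.
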
kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.521 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:27.522 node0=512 expecting 512 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:27.522 node1=512 expecting 512 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:27.522 00:04:27.522 real 0m3.401s 00:04:27.522 user 0m1.243s 00:04:27.522 sys 0m2.245s 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:27.522 13:30:15 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:27.522 ************************************ 00:04:27.522 END TEST even_2G_alloc 00:04:27.522 ************************************ 00:04:27.522 13:30:15 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:27.522 13:30:15 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:27.522 13:30:15 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:27.522 13:30:15 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:27.522 13:30:15 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:27.522 ************************************ 00:04:27.522 START TEST odd_alloc 00:04:27.522 
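
even_2G_alloc ends here because the per-node comparison at @126-@130 succeeds: nodes_test holds what the test requested, nodes_sys what the kernel actually allocated, the distinct counts on both sides collapse to 512, and the final [[ 512 == 512 ]] passes, so the test completes in roughly 3.4 s and run_test moves on to odd_alloc. A hedged reconstruction of that check; only the echoed lines and the final comparison are visible in the trace, so the bookkeeping around sorted_t/sorted_s is inferred:

    # Values taken from this run: both nodes were asked for and received 512 pages.
    declare -a nodes_test=(512 512) nodes_sys=(512 512)
    declare -a sorted_t=() sorted_s=()
    for node in "${!nodes_test[@]}"; do
      sorted_t[nodes_test[node]]=1      # indexed arrays: the subscript is evaluated arithmetically
      sorted_s[nodes_sys[node]]=1
      echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
    done
    # Pass when the distinct per-node counts agree -> [[ 512 == 512 ]] on this run.
    [[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo "even_2G_alloc OK"
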
************************************ 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:27.522 13:30:15 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:30.812 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:30.812 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:30.812 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:30.812 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:00:04.3 (8086 2021): Already using the 
vfio-pci driver 00:04:30.812 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:30.812 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79097480 kB' 'MemAvailable: 82351800 kB' 'Buffers: 11136 kB' 'Cached: 9188676 kB' 'SwapCached: 0 kB' 'Active: 6461772 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078484 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701984 kB' 'Mapped: 147740 kB' 'Shmem: 5379184 kB' 'KReclaimable: 174668 kB' 'Slab: 432380 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 257712 kB' 'KernelStack: 16048 kB' 'PageTables: 8036 kB' 'SecPageTables: 0 kB' 
'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7647512 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198640 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.812 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.813 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 
-- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79099548 kB' 'MemAvailable: 82353868 kB' 'Buffers: 11136 kB' 'Cached: 9188696 kB' 'SwapCached: 0 kB' 'Active: 6461184 kB' 'Inactive: 3437340 kB' 'Active(anon): 6077896 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701372 kB' 'Mapped: 147668 kB' 'Shmem: 5379204 kB' 'KReclaimable: 174668 kB' 'Slab: 432380 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 257712 kB' 'KernelStack: 16032 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7647528 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198608 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
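The odd_alloc prologue traced earlier converts HUGEMEM=2049 into size=2098176 kB, arrives at nr_hugepages=1025, and splits the total across two NUMA nodes as 512 and 513. Below is a short sketch of that arithmetic, assuming a ceiling division and a last-to-first node split; the trace only shows the inputs and the resulting per-node assignments, not the exact rounding rule, so treat the details as an assumption.

```bash
#!/usr/bin/env bash
# Sketch of the sizing arithmetic visible in the odd_alloc trace above.
hugepagesize_kb=2048          # "Hugepagesize: 2048 kB" from the meminfo dump
hugemem_mb=2049               # HUGEMEM=2049 exported by the odd_alloc test

size_kb=$((hugemem_mb * 1024))                                        # 2098176
nr_hugepages=$(((size_kb + hugepagesize_kb - 1) / hugepagesize_kb))   # 1025

# Split the total across the NUMA nodes the way the trace ends up:
# node1 gets the integer share (512) and the leftover lands on node0 (513).
no_nodes=2
declare -a nodes_test
remaining=$nr_hugepages
for ((node = no_nodes - 1; node >= 0; node--)); do
        share=$((remaining / (node + 1)))
        nodes_test[node]=$share
        remaining=$((remaining - share))
done

echo "nr_hugepages=$nr_hugepages"                      # 1025
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # 513 512
```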
00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.814 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 
13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:30.815 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79099816 kB' 'MemAvailable: 82354136 kB' 'Buffers: 11136 kB' 'Cached: 9188696 kB' 'SwapCached: 0 kB' 'Active: 6461908 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078620 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 702188 kB' 'Mapped: 147668 kB' 'Shmem: 5379204 kB' 'KReclaimable: 174668 kB' 'Slab: 432380 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 257712 kB' 'KernelStack: 16016 kB' 'PageTables: 7964 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7648832 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198608 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- 
# [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.816 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:30.817 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:30.818 nr_hugepages=1025 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:30.818 resv_hugepages=0 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:30.818 surplus_hugepages=0 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:30.818 anon_hugepages=0 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 
-- # local var val 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79100068 kB' 'MemAvailable: 82354388 kB' 'Buffers: 11136 kB' 'Cached: 9188716 kB' 'SwapCached: 0 kB' 'Active: 6461548 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078260 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701800 kB' 'Mapped: 147668 kB' 'Shmem: 5379224 kB' 'KReclaimable: 174668 kB' 'Slab: 432380 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 257712 kB' 'KernelStack: 16160 kB' 'PageTables: 8152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7649928 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198720 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 
13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.818 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:30.819 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37647384 kB' 'MemUsed: 10469556 kB' 'SwapCached: 0 kB' 'Active: 5375468 kB' 'Inactive: 3368240 kB' 'Active(anon): 5223792 kB' 'Inactive(anon): 0 kB' 'Active(file): 151676 kB' 'Inactive(file): 3368240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8301516 kB' 'Mapped: 74868 kB' 'AnonPages: 445332 kB' 'Shmem: 4781600 kB' 'KernelStack: 8792 kB' 'PageTables: 4144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105252 kB' 'Slab: 271632 kB' 'SReclaimable: 105252 kB' 'SUnreclaim: 166380 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 
'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.820 13:30:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.820 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41453612 kB' 'MemUsed: 2722920 kB' 'SwapCached: 0 kB' 'Active: 1085864 kB' 'Inactive: 69100 kB' 'Active(anon): 854252 kB' 'Inactive(anon): 0 kB' 'Active(file): 231612 kB' 'Inactive(file): 69100 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 898360 kB' 'Mapped: 72800 kB' 'AnonPages: 256144 kB' 'Shmem: 597648 kB' 'KernelStack: 7336 kB' 'PageTables: 3516 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 69416 kB' 'Slab: 160740 kB' 'SReclaimable: 69416 kB' 'SUnreclaim: 91324 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.821 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.822 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.822 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.822 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.822 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.822 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.822 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.822 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:30.822 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:30.822 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:30.822 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:30.822 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.082 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:31.083 node0=512 expecting 513 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:31.083 node1=513 expecting 512 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:31.083 00:04:31.083 real 0m3.458s 00:04:31.083 user 0m1.309s 00:04:31.083 sys 0m2.238s 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:31.083 13:30:19 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:31.083 ************************************ 00:04:31.083 END TEST odd_alloc 00:04:31.083 ************************************ 00:04:31.083 13:30:19 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:31.083 13:30:19 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:31.083 13:30:19 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:31.083 13:30:19 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:31.083 13:30:19 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:31.083 ************************************ 00:04:31.083 START TEST custom_alloc 00:04:31.083 ************************************ 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:31.083 13:30:19 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 
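The trace above requests 1048576 kB and then 2097152 kB of hugepages for the custom_alloc test. With the 2048 kB hugepage size reported later in /proc/meminfo, those sizes work out to the 512 and 1024 pages recorded in nodes_hp[0] and nodes_hp[1], and the per-node helper appears to split a request evenly across the two NUMA nodes when no explicit node list is supplied (256 + 256 for the first request). A minimal sketch of that arithmetic, assuming 2 MiB pages; the helper name below is illustrative, not the script's own:

# Illustrative only: turn a request given in kB into 2048 kB hugepages and
# split it evenly across NUMA nodes, mirroring the counts in the trace.
hugepage_kb=2048                        # Hugepagesize from /proc/meminfo
pages_for() { echo $(( $1 / hugepage_kb )); }
pages_for 1048576                       # -> 512  (nodes_hp[0])
pages_for 2097152                       # -> 1024 (nodes_hp[1])

nr=512 no_nodes=2                       # even split of the first request
for (( node = no_nodes - 1; node >= 0; node-- )); do
  echo "node${node}=$(( nr / no_nodes ))"   # 256 per node, as in the trace
done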
00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:31.083 13:30:19 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:34.375 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:34.375 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:34.375 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:34.375 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:80:04.7 
(8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:34.375 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:34.375 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78022324 kB' 'MemAvailable: 81276644 kB' 'Buffers: 11136 kB' 'Cached: 9188824 kB' 'SwapCached: 0 kB' 'Active: 6462096 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078808 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 702688 kB' 'Mapped: 148084 kB' 'Shmem: 5379332 kB' 'KReclaimable: 174668 kB' 'Slab: 432772 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258104 kB' 'KernelStack: 16064 kB' 'PageTables: 8120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 
7647704 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198704 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.376 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
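The long runs of '[[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]' followed by 'continue' are xtrace output of get_meminfo walking /proc/meminfo one line at a time with IFS=': ' until the requested field matches, at which point it echoes the value (0 kB of AnonHugePages on this host). A minimal standalone sketch of the same lookup pattern, assuming an ordinary /proc/meminfo; the function name is illustrative rather than the script's:

# Illustrative /proc/meminfo lookup in the style traced above: split each line
# on ': ', skip keys that do not match, print the value of the wanted key.
meminfo_value() {
  local want=$1 var val _
  while IFS=': ' read -r var val _; do
    [[ $var == "$want" ]] || continue   # every skipped key shows up as a 'continue' in xtrace
    echo "$val"
    return 0
  done < /proc/meminfo
  echo 0                                # fall back to 0 if the key is absent
}
meminfo_value AnonHugePages             # prints 0 on this box
meminfo_value HugePages_Total           # 1536 in the snapshot above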
00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78022504 kB' 'MemAvailable: 81276824 kB' 'Buffers: 11136 kB' 'Cached: 9188828 kB' 'SwapCached: 0 kB' 'Active: 6461556 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078268 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 702140 kB' 'Mapped: 147604 kB' 'Shmem: 5379336 kB' 'KReclaimable: 174668 kB' 'Slab: 432752 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258084 kB' 'KernelStack: 16064 kB' 'PageTables: 8076 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7647724 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198688 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.377 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
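The snapshots dumped before each lookup are internally consistent with the pool configured earlier: nodes_hp[0]=512 plus nodes_hp[1]=1024 gives the 1536 in HugePages_Total, each page is 2048 kB, and HugePages_Free still equals HugePages_Total, so none of the pool is in use yet. A quick arithmetic check, using only values copied from the log:

# Consistency of the /proc/meminfo snapshot with the requested pool
echo $(( 512 + 1024 ))                  # 1536            == HugePages_Total
echo $(( 1536 * 2048 ))                 # 3145728 kB      == Hugetlb
echo $(( 1536 * 2048 / 1024 / 1024 ))   # 3 GiB of memory backing the pool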
00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.378 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
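Beyond AnonHugePages, the verifier also fetches HugePages_Surp (and, in the entries that follow, HugePages_Rsvd): in /proc/meminfo, HugePages_Rsvd counts pages reserved by mappings but not yet faulted in, and HugePages_Surp counts surplus pages allocated above the configured pool; both are 0 in these snapshots, so all 1536 pages are ordinary pool pages. For per-node counts the same parser reads /sys/devices/system/node/node<N>/meminfo, whose lines carry a 'Node <N> ' prefix (stripped in the trace by the ${mem[@]#Node +([0-9]) } expansion). A minimal sketch of such a per-node lookup, with an illustrative function name and HugePages_Free as the example field:

# Illustrative per-node lookup: node meminfo lines look like
# "Node 0 HugePages_Free:   512", so drop the first two words and match the key.
node_hugepages_free() {
  local node=$1 var val
  while read -r _ _ var val; do
    [[ $var == HugePages_Free: ]] && { echo "$val"; return 0; }
  done < "/sys/devices/system/node/node${node}/meminfo"
  echo 0
}
node_hugepages_free 0                   # a per-node figure of the kind compared against nodes_test[0]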
00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78022896 kB' 'MemAvailable: 81277216 kB' 'Buffers: 11136 kB' 'Cached: 9188864 kB' 'SwapCached: 0 kB' 'Active: 6461416 kB' 'Inactive: 3437340 kB' 
'Active(anon): 6078128 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701932 kB' 'Mapped: 147604 kB' 'Shmem: 5379372 kB' 'KReclaimable: 174668 kB' 'Slab: 432752 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258084 kB' 'KernelStack: 16048 kB' 'PageTables: 8024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7647744 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198688 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.379 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.380 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 
13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:34.381 nr_hugepages=1536 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:34.381 resv_hugepages=0 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:34.381 surplus_hugepages=0 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:34.381 anon_hugepages=0 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:34.381 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78023264 kB' 'MemAvailable: 81277584 kB' 'Buffers: 11136 kB' 'Cached: 9188884 kB' 'SwapCached: 0 kB' 'Active: 6461260 kB' 'Inactive: 3437340 kB' 'Active(anon): 6077972 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701756 kB' 'Mapped: 147604 kB' 'Shmem: 5379392 kB' 'KReclaimable: 174668 kB' 'Slab: 432752 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258084 kB' 'KernelStack: 16048 kB' 'PageTables: 8020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7647768 kB' 
'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198688 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
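Annotation for readers of this trace: the xtrace above and below is setup/common.sh's get_meminfo helper walking every key of /proc/meminfo (or of a per-node meminfo file) and hitting "continue" until the key matches the one requested; the backslash-escaped names such as \H\u\g\e\P\a\g\e\s\_\S\u\r\p are simply how bash xtrace renders the quoted right-hand side of the [[ == ]] test. The following is a minimal sketch of that helper, reconstructed from the trace and simplified; it is not the exact SPDK setup/common.sh code, and the loop shape and return values are assumptions beyond what the trace shows.

shopt -s extglob   # needed for the +([0-9]) pattern that appears in the trace
get_meminfo() {
    # get_meminfo <key> [node]  -> prints the value column for <key>
    local get=$1 node=${2:-} mem_f=/proc/meminfo
    local mem line var val _
    # When a node id is given and sysfs exposes it, read the per-node counters instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node N " prefix used by per-node files
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }   # e.g. 0 for HugePages_Rsvd
    done
    return 1
}

With that shape, the calls traced here behave as get_meminfo HugePages_Surp -> 0 (surp=0), get_meminfo HugePages_Rsvd -> 0 (resv=0), and the HugePages_Total scan that continues below.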
00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.382 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.642 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37625016 kB' 'MemUsed: 10491924 kB' 'SwapCached: 0 kB' 'Active: 5375640 kB' 'Inactive: 3368240 kB' 'Active(anon): 5223964 kB' 'Inactive(anon): 0 kB' 'Active(file): 151676 kB' 'Inactive(file): 3368240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8301640 kB' 'Mapped: 74880 kB' 'AnonPages: 445364 kB' 'Shmem: 4781724 kB' 'KernelStack: 8872 kB' 'PageTables: 4512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105252 kB' 'Slab: 271804 kB' 'SReclaimable: 105252 kB' 'SUnreclaim: 166552 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.643 13:30:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
[... the same setup/common.sh@32/@31 check / continue / IFS / read trace repeats for every remaining field of this node's meminfo, SwapCached through HugePages_Free ...]
00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:34.644 13:30:23
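The wall of setup/common.sh@31/@32 events above is get_meminfo scanning the current node's meminfo for HugePages_Surp: each 'Field: value' line is split with IFS=': ', every non-matching field falls through to continue, and the first match is echoed (0 here) before the helper returns. A minimal stand-alone sketch of that scan, written as a simplified reconstruction rather than the real setup/common.sh helper:

    #!/usr/bin/env bash
    # get_meminfo_sketch FIELD [NODE] - echo FIELD's value from (per-node) meminfo.
    # Simplified reconstruction of the scan traced above; not the real SPDK helper.
    get_meminfo_sketch() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        local -a mem
        local line var val _
        # Prefer the per-node file when a node id is given and the file exists.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node <id> "; strip that prefix.
        mem=("${mem[@]#Node $node }")
        # Walk the "Field: value [kB]" entries until the requested field matches.
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done
        return 1
    }

    # Example: surplus hugepages on NUMA node 1 (0 in this run).
    get_meminfo_sketch HugePages_Surp 1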
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 40401576 kB' 'MemUsed: 3774956 kB' 'SwapCached: 0 kB' 'Active: 1085968 kB' 'Inactive: 69100 kB' 'Active(anon): 854356 kB' 'Inactive(anon): 0 kB' 'Active(file): 231612 kB' 'Inactive(file): 69100 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 898384 kB' 'Mapped: 72724 kB' 'AnonPages: 256756 kB' 'Shmem: 597672 kB' 'KernelStack: 7160 kB' 'PageTables: 3452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 69416 kB' 'Slab: 160948 kB' 'SReclaimable: 69416 kB' 'SUnreclaim: 91532 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:34.644 13:30:23 
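Right above, setup/hugepages.sh@115-117 walks every NUMA node and adds the previously computed reservation count plus the node's HugePages_Surp value onto nodes_test[node]; the node1 dump emitted by setup/common.sh@16 shows 1024 hugepages total, 1024 free and 0 surplus, so nothing changes. A rough sketch of that accumulation, assuming per-node meminfo files under /sys/devices/system/node (resv is only a placeholder here because it is computed outside this excerpt):

    # Per-node accounting as traced at setup/hugepages.sh@115-117 (illustrative).
    nodes_test=([0]=512 [1]=1024)   # pages custom_alloc asked for on each node
    resv=0                          # placeholder; the real value is computed earlier

    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        # Surplus pages the kernel allocated beyond the configured pool on this node.
        surp=$(awk '/HugePages_Surp/ {print $NF}' \
            "/sys/devices/system/node/node$node/meminfo" 2>/dev/null)
        (( nodes_test[node] += ${surp:-0} ))
    done

    printf 'node%s=%s\n' 0 "${nodes_test[0]}" 1 "${nodes_test[1]}"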
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.644 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
[... the same setup/common.sh@32/@31 check / continue / IFS / read trace repeats for every node1 meminfo field from SwapCached through HugePages_Free ...]
00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc --
setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:34.645 node0=512 expecting 512 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:04:34.645 node1=1024 expecting 1024 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:04:34.645 00:04:34.645 real 0m3.542s 00:04:34.645 user 0m1.392s 00:04:34.645 sys 0m2.246s 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:34.645 13:30:23 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:34.645 ************************************ 00:04:34.646 END TEST custom_alloc 00:04:34.646 ************************************ 00:04:34.646 13:30:23 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:34.646 13:30:23 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:04:34.646 13:30:23 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:34.646 13:30:23 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:34.646 13:30:23 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:34.646 ************************************ 00:04:34.646 START TEST no_shrink_alloc 00:04:34.646 ************************************ 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:34.646 13:30:23 
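This stretch closes custom_alloc and opens no_shrink_alloc: the per-node results are echoed as node0=512 expecting 512 and node1=1024 expecting 1024, the joined string 512,1024 matches the expected layout at setup/hugepages.sh@130, the test finishes in roughly 3.5 s, and no_shrink_alloc then asks get_test_nr_hugepages for 2097152 kB of hugepages restricted to node 0. An illustrative sketch of the layout check and of the size-to-pages arithmetic; the trace only shows the resulting nr_hugepages=1024, so deriving it by dividing by the default 2048 kB hugepage size is an assumption:

    # 1) custom_alloc pass criterion (cf. setup/hugepages.sh@126-130): join the
    #    per-node counts with commas and compare against the expected layout.
    nodes_test=([0]=512 [1]=1024)
    layout=$(IFS=,; echo "${nodes_test[*]}")      # -> "512,1024"
    [[ $layout == "512,1024" ]] && echo "custom_alloc: per-node layout OK"

    # 2) no_shrink_alloc sizing (cf. setup/hugepages.sh@49-57): 2097152 kB (2 GiB)
    #    at 2048 kB per hugepage presumably yields the nr_hugepages=1024 seen above.
    size_kb=2097152
    hugepage_kb=2048
    echo "nr_hugepages=$(( size_kb / hugepage_kb ))"   # 1024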
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:34.646 13:30:23 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:37.932 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:37.932 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:37.932 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:37.932 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:37.932 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local 
get=AnonHugePages 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78963852 kB' 'MemAvailable: 82218172 kB' 'Buffers: 11136 kB' 'Cached: 9188972 kB' 'SwapCached: 0 kB' 'Active: 6463132 kB' 'Inactive: 3437340 kB' 'Active(anon): 6079844 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 702980 kB' 'Mapped: 147680 kB' 'Shmem: 5379480 kB' 'KReclaimable: 174668 kB' 'Slab: 433488 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258820 kB' 'KernelStack: 16064 kB' 'PageTables: 8284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7648236 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198704 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.932 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- 
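verify_nr_hugepages starts by testing whether transparent hugepages are disabled: the 'always [madvise] never' string checked at setup/hugepages.sh@96 matches the format of the THP 'enabled' sysfs knob, so that is presumably where it comes from, and because '[never]' is not the selected policy the script goes on to read AnonHugePages (0 kB in the /proc/meminfo dump above). A small sketch of that step, assuming the standard sysfs path:

    # THP check as traced at setup/hugepages.sh@96-97 (the trace does not show the
    # path, so /sys/kernel/mm/transparent_hugepage/enabled is an assumption).
    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    anon=0
    if [[ $thp != *"[never]"* ]]; then
        # THP not disabled, so anonymous hugepages may exist; read the counter (kB).
        anon=$(awk '/^AnonHugePages/ {print $2}' /proc/meminfo)
    fi
    echo "anon=$anon"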
setup/common.sh@31 -- # IFS=': ' 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[... the same setup/common.sh@32/@31 check / continue / IFS / read trace repeats for every /proc/meminfo field from Buffers through VmallocTotal ...]
00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- #
continue 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.933 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78963852 kB' 'MemAvailable: 82218172 kB' 'Buffers: 11136 kB' 'Cached: 9188972 kB' 'SwapCached: 0 kB' 'Active: 6463012 kB' 'Inactive: 3437340 kB' 'Active(anon): 6079724 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 703460 kB' 'Mapped: 147616 kB' 'Shmem: 5379480 kB' 'KReclaimable: 
174668 kB' 'Slab: 433472 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258804 kB' 'KernelStack: 16000 kB' 'PageTables: 8108 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7648256 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198640 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[... the same setup/common.sh@32/@31 check / continue / IFS / read trace repeats for the fields Inactive through Shmem ...]
00:04:37.934 13:30:26
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.934 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78965720 kB' 'MemAvailable: 82220040 kB' 'Buffers: 11136 kB' 'Cached: 9188988 kB' 'SwapCached: 0 kB' 'Active: 6462568 kB' 'Inactive: 3437340 kB' 'Active(anon): 6079280 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 703056 kB' 'Mapped: 147616 kB' 'Shmem: 5379496 kB' 'KReclaimable: 174668 kB' 'Slab: 433568 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258900 kB' 
'KernelStack: 16000 kB' 'PageTables: 7904 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7647908 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198576 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.935 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.935 13:30:26 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.936 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
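What the xtrace records above and below show is setup/common.sh's get_meminfo helper scanning a meminfo snapshot key by key: every "Key: value" pair that is not the requested key (here HugePages_Rsvd) hits the continue branch, and the first matching key has its value echoed before the function returns. A minimal, self-contained bash sketch of that scan, reconstructed from this trace only; the real setup/common.sh may differ in detail, and the example calls at the bottom just reflect the values seen in this run.

shopt -s extglob                       # the +([0-9]) pattern below needs extglob

get_meminfo() {                        # usage: get_meminfo <Key> [<numa-node>]
    local get=$1 node=${2:-}
    local var val
    local mem_f=/proc/meminfo mem

    # prefer the per-node snapshot when a node number is given and the file exists
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix each line with "Node N "

    # scan every "Key: value [kB]" record; skip until the requested key matches
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

get_meminfo HugePages_Rsvd             # -> 0 in this run (system-wide /proc/meminfo)
get_meminfo HugePages_Surp 0           # -> 0 in this run (NUMA node 0 meminfo)
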
00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:37.937 nr_hugepages=1024 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:37.937 resv_hugepages=0 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:37.937 surplus_hugepages=0 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:37.937 anon_hugepages=0 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78966796 kB' 'MemAvailable: 82221116 kB' 'Buffers: 11136 kB' 'Cached: 9189032 kB' 'SwapCached: 0 kB' 'Active: 6461576 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078288 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 701968 kB' 'Mapped: 147616 kB' 'Shmem: 5379540 kB' 'KReclaimable: 174668 
kB' 'Slab: 433568 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258900 kB' 'KernelStack: 15984 kB' 'PageTables: 7832 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7647936 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198592 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.937 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.197 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
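At this point in the trace the script has already established surp=0 and resv=0 from the HugePages_Surp and HugePages_Rsvd lookups above, has echoed the bookkeeping values nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0 and anon_hugepages=0, and is now reading HugePages_Total back out of /proc/meminfo to confirm that the counters add up; the per-node records further down repeat the same accounting against /sys/devices/system/node/node0/meminfo and node1 (1024 pages on node0, 0 on node1). A hedged sketch of that consistency check, reusing the get_meminfo sketch above; this is illustrative only, as the actual setup/hugepages.sh wires these steps differently in detail.

nr_hugepages=1024                      # expected number of 2048 kB hugepages

surp=$(get_meminfo HugePages_Surp)     # 0 in this run
resv=$(get_meminfo HugePages_Rsvd)     # 0 in this run

echo "nr_hugepages=$nr_hugepages"
echo "resv_hugepages=$resv"
echo "surplus_hugepages=$surp"

# system-wide consistency: every configured page must be accounted for
if (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv )); then
    echo "hugepage counters are consistent"
fi

# per-NUMA-node view: each nodeN/meminfo carries its own HugePages_* counters
for node_dir in /sys/devices/system/node/node[0-9]*; do
    [[ -d $node_dir ]] || continue
    node=${node_dir##*node}
    printf 'node%s: HugePages_Total=%s HugePages_Surp=%s\n' \
        "$node" \
        "$(get_meminfo HugePages_Total "$node")" \
        "$(get_meminfo HugePages_Surp "$node")"
done
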
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.198 13:30:26 
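The trace up to this point is setup/common.sh's get_meminfo scanning a meminfo file key by key until the requested field (here HugePages_Total) is found; the matching value is echoed just below. A minimal standalone sketch of that parsing pattern, using a hypothetical helper name (get_meminfo_sketch) rather than SPDK's actual function, might look like:

  #!/usr/bin/env bash
  # Minimal sketch of a get_meminfo-style lookup (hypothetical helper, not
  # SPDK's setup/common.sh): read /proc/meminfo, or the per-node meminfo
  # file when a node number is given, split each line on ': ', and print
  # the value of the first field whose name matches the requested key.
  get_meminfo_sketch() {
      local get=$1 node=${2:-}
      local mem_f=/proc/meminfo line var val _
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      while IFS= read -r line; do
          # Per-node files prefix every entry with "Node N "; strip it first.
          line=${line#Node "$node" }
          IFS=': ' read -r var val _ <<< "$line"
          if [[ $var == "$get" ]]; then
              echo "$val"
              return 0
          fi
      done < "$mem_f"
      return 1
  }
  # Example calls matching the lookups in this log:
  #   get_meminfo_sketch HugePages_Total
  #   get_meminfo_sketch HugePages_Surp 0

Called as get_meminfo_sketch HugePages_Surp 0 it would read the node0 file, mirroring the per-node lookup that follows in the log.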
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:38.198 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:38.199 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36570316 kB' 'MemUsed: 11546624 kB' 'SwapCached: 0 kB' 'Active: 5376356 kB' 'Inactive: 3368240 kB' 'Active(anon): 5224680 kB' 'Inactive(anon): 0 kB' 'Active(file): 151676 kB' 'Inactive(file): 3368240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8301756 kB' 'Mapped: 74892 kB' 'AnonPages: 445984 kB' 'Shmem: 4781840 kB' 'KernelStack: 8872 kB' 'PageTables: 4604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105252 kB' 'Slab: 272268 kB' 'SReclaimable: 105252 kB' 'SUnreclaim: 167016 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:38.199 13:30:26 (the setup/common.sh@31/@32 read-and-continue trace repeats for every node0 meminfo field from MemTotal through HugePages_Free; none of them matches HugePages_Surp)
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:38.200 node0=1024 expecting 1024
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
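The "node0=1024 expecting 1024" check above is a per-NUMA-node tally: the script lists /sys/devices/system/node/node*, works out how many hugepages each node should hold, and compares that with what the node actually reports. A simplified sketch of the same idea, where the expected_for_node table is a stand-in assumption for hugepages.sh's nodes_test/nodes_sys bookkeeping, could be:

  #!/usr/bin/env bash
  # Simplified sketch of the per-node hugepage check that prints lines like
  # "node0=1024 expecting 1024". The expected_for_node table below is an
  # illustrative assumption, not hugepages.sh's real nodes_test/nodes_sys state.
  declare -A expected_for_node=( [0]=1024 [1]=0 )

  verify_nodes_sketch() {
      local node id got want rc=0
      for node in /sys/devices/system/node/node[0-9]*; do
          id=${node##*node}
          # HugePages_Total for this node, e.g. "Node 0 HugePages_Total: 1024"
          got=$(awk '/HugePages_Total/ {print $NF}' "$node/meminfo")
          want=${expected_for_node[$id]:-0}
          echo "node$id=${got:-0} expecting $want"
          (( ${got:-0} == want )) || rc=1
      done
      return $rc
  }
  verify_nodes_sketch

On a two-node box configured like the one in this log, the sketch would print one line per node and return non-zero only if a node's count drifted from the expected value.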
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:38.200 13:30:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:41.490 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:04:41.490 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:04:41.490 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:04:41.490 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:41.490 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:41.490 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:41.490 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78975784 kB' 'MemAvailable: 82230104 kB' 'Buffers: 11136 kB' 'Cached: 9189104 kB' 'SwapCached: 0 kB' 'Active: 6463144 kB' 'Inactive: 3437340 kB' 'Active(anon): 6079856 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 703432 kB' 'Mapped: 147676 kB' 'Shmem: 5379612 kB' 'KReclaimable: 174668 kB' 'Slab: 433396 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258728 kB' 'KernelStack: 16048 kB' 'PageTables: 8052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7649096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198672 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB'
00:04:41.491 13:30:29 (the setup/common.sh@31/@32 read-and-continue trace repeats for every /proc/meminfo field from MemTotal through HardwareCorrupted; none of them matches AnonHugePages)
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
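The hugepages.sh@96-97 steps just above first check whether transparent hugepages are disabled and, since this host reports "always [madvise] never", go on to read AnonHugePages (which comes back 0). A condensed sketch of that guard, not the verbatim script, might be:

  #!/usr/bin/env bash
  # Condensed sketch of the transparent-hugepage guard traced above: only if
  # THP is not pinned to [never] is AnonHugePages read from /proc/meminfo.
  # On this host the setting is "always [madvise] never", so the read happens
  # and the value is 0 kB.
  anon=0
  thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null)
  if [[ $thp != *"[never]"* ]]; then
      anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
  fi
  echo "AnonHugePages: ${anon} kB"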
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:41.492 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78977512 kB' 'MemAvailable: 82231832 kB' 'Buffers: 11136 kB' 'Cached: 9189104 kB' 'SwapCached: 0 kB' 'Active: 6463156 kB' 'Inactive: 3437340 kB' 'Active(anon): 6079868 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 703424 kB' 'Mapped: 147628 kB' 'Shmem: 5379612 kB' 'KReclaimable: 174668 kB' 'Slab: 433412 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258744 kB' 'KernelStack: 16080 kB' 'PageTables: 8216 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7648744 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198656 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB'
00:04:41.493 13:30:29 (the setup/common.sh@31/@32 read-and-continue trace repeats for every /proc/meminfo field from MemTotal through FileHugePages; none of them matches HugePages_Surp)
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- 
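The repeated IFS=': ' / read / continue entries above are setup/common.sh's get_meminfo helper walking /proc/meminfo one "key: value" line at a time until it reaches the requested field (here HugePages_Surp, which comes back as 0). A minimal standalone sketch of that scanning pattern follows; the function name get_field and its exact layout are illustrative assumptions, not the SPDK helper itself:

#!/usr/bin/env bash
# Sketch of the per-key scan seen in the trace: read /proc/meminfo line by
# line with IFS=': ' and print the value of a single requested key.
get_field() {
	local get=$1 var val _
	while IFS=': ' read -r var val _; do
		[[ $var == "$get" ]] || continue   # every other key is skipped
		echo "$val"
		return 0
	done < /proc/meminfo
	return 1
}

get_field HugePages_Surp   # prints 0 on the node traced above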
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:41.494 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78977528 kB' 'MemAvailable: 82231848 kB' 'Buffers: 11136 kB' 'Cached: 9189144 kB' 'SwapCached: 0 kB' 'Active: 6462156 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078868 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 702356 kB' 'Mapped: 147628 kB' 'Shmem: 5379652 kB' 'KReclaimable: 174668 kB' 'Slab: 433412 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258744 kB' 'KernelStack: 15984 kB' 'PageTables: 7836 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7648772 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198624 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB'
[... setup/common.sh@31-32: each /proc/meminfo key, MemTotal through HugePages_Free, is read, tested against HugePages_Rsvd and skipped with continue ...]
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:41.497 nr_hugepages=1024
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:41.497 resv_hugepages=0
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:41.497 surplus_hugepages=0
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:41.497 anon_hugepages=0
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
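The two arithmetic tests at setup/hugepages.sh@107 and @109 are the real assertion of this stage: the kernel-reported hugepage pool must match what the test requested, i.e. HugePages_Total == nr_hugepages + surplus + reserved, here 1024 == 1024 + 0 + 0. A standalone sketch of the same check; the variable names and layout are illustrative, not the SPDK script:

#!/usr/bin/env bash
# Recompute the hugepage accounting check from the trace: the total the kernel
# reports must equal requested pages plus surplus plus reserved pages.
nr_hugepages=1024   # values echoed in the log
surp=0
resv=0
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
if (( total == nr_hugepages + surp + resv )); then
	echo "hugepage accounting consistent: $total == $nr_hugepages + $surp + $resv"
else
	echo "mismatch: kernel reports $total hugepages" >&2
	exit 1
fi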
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:41.497 13:30:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78977844 kB' 'MemAvailable: 82232164 kB' 'Buffers: 11136 kB' 'Cached: 9189164 kB' 'SwapCached: 0 kB' 'Active: 6462172 kB' 'Inactive: 3437340 kB' 'Active(anon): 6078884 kB' 'Inactive(anon): 0 kB' 'Active(file): 383288 kB' 'Inactive(file): 3437340 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 702356 kB' 'Mapped: 147628 kB' 'Shmem: 5379672 kB' 'KReclaimable: 174668 kB' 'Slab: 433412 kB' 'SReclaimable: 174668 kB' 'SUnreclaim: 258744 kB' 'KernelStack: 15984 kB' 'PageTables: 7836 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7648792 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198624 kB' 'VmallocChunk: 0 kB' 'Percpu: 46400 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 382372 kB' 'DirectMap2M: 8730624 kB' 'DirectMap1G: 92274688 kB'
[... setup/common.sh@31-32: each /proc/meminfo key, MemTotal through Unaccepted, is read, tested against HugePages_Total and skipped with continue ...]
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
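get_nodes above discovers two NUMA nodes under /sys/devices/system/node and records the per-node hugepage counts, apparently 1024 pages on node0 and 0 on node1; the loop that follows re-reads each node's own meminfo file to verify the split. A standalone sketch of that per-node enumeration; the array name node_pages and the awk extraction are illustrative assumptions, not the SPDK setup/hugepages.sh helper:

#!/usr/bin/env bash
# Sketch of the per-node pass seen above: enumerate NUMA nodes under sysfs and
# read each node's hugepage count from its own meminfo file.
declare -A node_pages
for node in /sys/devices/system/node/node[0-9]*; do
	id=${node##*node}
	# nodeN/meminfo lines are prefixed "Node N", e.g. "Node 0 HugePages_Total: 1024"
	node_pages[$id]=$(awk '/HugePages_Total:/ {print $4}' "$node/meminfo")
done
echo "nodes found: ${#node_pages[@]}"                  # 2 on the machine traced above
for id in "${!node_pages[@]}"; do
	echo "node$id HugePages_Total=${node_pages[$id]}"  # 1024 on node0, 0 on node1 above
done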
setup/common.sh@20 -- # local mem_f mem 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36566424 kB' 'MemUsed: 11550516 kB' 'SwapCached: 0 kB' 'Active: 5376692 kB' 'Inactive: 3368240 kB' 'Active(anon): 5225016 kB' 'Inactive(anon): 0 kB' 'Active(file): 151676 kB' 'Inactive(file): 3368240 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8301864 kB' 'Mapped: 74904 kB' 'AnonPages: 446452 kB' 'Shmem: 4781948 kB' 'KernelStack: 8856 kB' 'PageTables: 4608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 105252 kB' 'Slab: 272216 kB' 'SReclaimable: 105252 kB' 'SUnreclaim: 166964 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.499 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 
13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.500 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.501 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:41.759 node0=1024 expecting 1024 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:41.759 00:04:41.759 real 0m6.946s 00:04:41.759 user 0m2.719s 00:04:41.759 sys 0m4.413s 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:41.759 13:30:30 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:41.759 ************************************ 00:04:41.759 END TEST no_shrink_alloc 00:04:41.759 ************************************ 00:04:41.759 13:30:30 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:41.759 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:41.759 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:41.759 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:41.759 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:41.759 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:41.759 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:41.760 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:41.760 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:41.760 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:41.760 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:41.760 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:41.760 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:41.760 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:41.760 13:30:30 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:41.760 00:04:41.760 real 0m27.548s 00:04:41.760 user 0m9.798s 00:04:41.760 sys 0m16.170s 00:04:41.760 13:30:30 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:41.760 13:30:30 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:41.760 ************************************ 00:04:41.760 END TEST hugepages 00:04:41.760 ************************************ 00:04:41.760 13:30:30 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:41.760 13:30:30 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:41.760 13:30:30 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:41.760 13:30:30 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:41.760 13:30:30 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:41.760 ************************************ 00:04:41.760 START TEST driver 00:04:41.760 ************************************ 00:04:41.760 13:30:30 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:41.760 * Looking for test storage... 
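[editor's note] Most of the hugepages trace above is produced by the get_meminfo helper in test/setup/common.sh: it loads /proc/meminfo (or the per-node /sys/devices/system/node/nodeN/meminfo file), strips the "Node N " prefix, and walks every "key: value" pair, skipping with continue until it reaches the requested field. That is why each query emits one [[ ... == \H\u\g\e\P\a\g\e\s\_... ]] / continue pair per meminfo key before finally echoing 1024 or 0. A sketch of that pattern, reconstructed from the xtrace for illustration only (the exact wording of the shipped script may differ):

    shopt -s extglob   # needed for the +([0-9]) pattern below

    # get_meminfo KEY [NODE] - echo the numeric value of KEY,
    # e.g. "get_meminfo HugePages_Surp 0" prints 0 in the run above.
    get_meminfo() {
        local get=$1 node=$2
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # Per-node queries read the node's own meminfo file when it exists.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node 0 " prefix of per-node lines
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # the repeated "continue" entries in the log
            echo "$val" && return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }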
00:04:41.760 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:41.760 13:30:30 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:41.760 13:30:30 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:41.760 13:30:30 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:47.029 13:30:34 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:47.029 13:30:34 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:47.029 13:30:34 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:47.029 13:30:34 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:47.029 ************************************ 00:04:47.029 START TEST guess_driver 00:04:47.029 ************************************ 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 168 > 0 )) 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:47.029 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:47.029 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:47.029 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:47.029 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:47.029 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:47.029 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:47.029 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- 
setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:47.029 Looking for driver=vfio-pci 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:47.029 13:30:34 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:49.592 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:49.592 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:49.592 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.592 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:04:49.592 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:49.592 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.850 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.851 13:30:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.383 13:30:40 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:52.383 13:30:40 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:52.383 13:30:40 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:52.383 13:30:40 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:52.383 13:30:40 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:52.383 13:30:40 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:52.383 13:30:40 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:57.653 00:04:57.653 real 
0m10.494s 00:04:57.653 user 0m2.616s 00:04:57.653 sys 0m4.897s 00:04:57.653 13:30:45 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:57.653 13:30:45 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:57.653 ************************************ 00:04:57.653 END TEST guess_driver 00:04:57.653 ************************************ 00:04:57.653 13:30:45 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:04:57.653 00:04:57.653 real 0m15.294s 00:04:57.653 user 0m3.964s 00:04:57.653 sys 0m7.575s 00:04:57.653 13:30:45 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:57.653 13:30:45 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:57.653 ************************************ 00:04:57.653 END TEST driver 00:04:57.653 ************************************ 00:04:57.653 13:30:45 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:57.653 13:30:45 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:57.653 13:30:45 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:57.653 13:30:45 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:57.653 13:30:45 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:57.653 ************************************ 00:04:57.653 START TEST devices 00:04:57.653 ************************************ 00:04:57.653 13:30:45 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:57.653 * Looking for test storage... 00:04:57.653 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:57.653 13:30:45 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:57.653 13:30:45 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:57.653 13:30:45 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:57.653 13:30:45 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:00.938 13:30:49 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:00.938 13:30:49 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:00.938 13:30:49 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:00.938 13:30:49 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:00.938 13:30:49 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:00.938 13:30:49 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:00.938 13:30:49 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:00.938 13:30:49 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@200 -- 
# for block in "/sys/block/nvme"!(*c*) 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:00.938 13:30:49 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:05:00.938 13:30:49 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:00.938 No valid GPT data, bailing 00:05:00.938 13:30:49 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:00.938 13:30:49 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:05:00.938 13:30:49 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:00.938 13:30:49 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:00.938 13:30:49 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:00.938 13:30:49 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:00.938 13:30:49 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:00.938 13:30:49 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:00.938 13:30:49 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:00.938 13:30:49 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:01.197 ************************************ 00:05:01.197 START TEST nvme_mount 00:05:01.197 ************************************ 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- 
setup/common.sh@44 -- # parts=() 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:01.197 13:30:49 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:02.135 Creating new GPT entries in memory. 00:05:02.135 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:02.135 other utilities. 00:05:02.135 13:30:50 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:02.135 13:30:50 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:02.135 13:30:50 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:02.135 13:30:50 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:02.135 13:30:50 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:03.072 Creating new GPT entries in memory. 00:05:03.072 The operation has completed successfully. 
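[editor's note] The sgdisk sequence just traced (a --zap-all followed by a flock-guarded --new=1:2048:2099199) is the partition_drive step of the nvme_mount test: the 1073741824-byte size is converted to 512-byte sectors, so a first partition starting at sector 2048 ends at 2048 + 2097152 - 1 = 2099199. Below is a condensed sketch of that sequence for illustration; the argument defaults are assumptions, and the trace additionally routes the sgdisk call through scripts/sync_dev_uevents.sh to wait for the new partition's uevent, which is omitted here:

    # partition_drive DISK [PART_NO] [SIZE_BYTES] - wipe DISK, create PART_NO equal partitions.
    partition_drive() {
        local disk=$1 part_no=${2:-1} size=${3:-1073741824}
        local part part_start=0 part_end=0
        local parts=()

        for ((part = 1; part <= part_no; part++)); do
            parts+=("${disk}p$part")          # e.g. nvme0n1p1
        done

        ((size /= 512))                       # sgdisk takes sector counts, not bytes
        sgdisk "/dev/$disk" --zap-all         # destroy existing GPT/MBR structures

        for ((part = 1; part <= part_no; part++)); do
            ((part_start = part_start == 0 ? 2048 : part_end + 1))
            ((part_end = part_start + size - 1))
            # flock serializes table rewrites on the shared test node
            flock "/dev/$disk" sgdisk "/dev/$disk" --new="$part:$part_start:$part_end"
        done

        printf '%s\n' "${parts[@]}"           # hand the new partition names back to the caller
    }

Invoked as "partition_drive nvme0n1 1" this reproduces the 1 GiB nvme0n1p1 partition that mkfs.ext4 -qF formats and mounts a few lines later in the log.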
00:05:03.072 13:30:51 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:03.072 13:30:51 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:03.072 13:30:51 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 371984 00:05:03.072 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:03.072 13:30:51 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:03.072 13:30:51 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:03.072 13:30:51 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:03.072 13:30:51 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:03.073 13:30:51 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:06.363 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:06.622 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:06.622 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:06.622 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:06.622 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:06.622 13:30:54 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:06.882 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:06.882 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:06.882 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:06.882 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:06.882 
13:30:55 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:06.882 13:30:55 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:05:10.336 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:10.337 13:30:58 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.806 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.806 13:31:01 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:13.807 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.807 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:13.807 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:13.807 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:13.807 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:13.807 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.807 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:13.807 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:13.807 13:31:01 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:13.807 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:13.807 00:05:13.807 real 0m12.359s 00:05:13.807 user 0m3.664s 00:05:13.807 sys 0m6.676s 00:05:13.807 13:31:01 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:13.807 13:31:01 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:13.807 ************************************ 00:05:13.807 END TEST nvme_mount 00:05:13.807 ************************************ 00:05:13.807 13:31:01 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:13.807 13:31:01 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:13.807 13:31:01 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:13.807 13:31:01 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:13.807 13:31:01 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:13.807 ************************************ 00:05:13.807 START TEST dm_mount 00:05:13.807 ************************************ 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:13.807 13:31:01 
setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:13.807 13:31:01 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:14.458 Creating new GPT entries in memory. 00:05:14.458 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:14.458 other utilities. 00:05:14.458 13:31:02 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:14.458 13:31:02 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:14.458 13:31:02 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:14.458 13:31:02 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:14.458 13:31:02 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:15.834 Creating new GPT entries in memory. 00:05:15.834 The operation has completed successfully. 00:05:15.834 13:31:04 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:15.834 13:31:04 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:15.834 13:31:04 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:15.834 13:31:04 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:15.834 13:31:04 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:16.782 The operation has completed successfully. 
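The partitioning pass above boils down to a small amount of sector arithmetic in setup/common.sh: the requested size (1 GiB) is converted to 512-byte sectors, the first partition starts at sector 2048, and each subsequent partition starts right after the previous one, with sgdisk run under flock while sync_dev_uevents.sh waits for the resulting uevents. A minimal stand-alone sketch of that loop, with the disk name and partition count assumed for illustration:

    # Sketch of the partition loop reflected in the log above; disk and part_no are
    # assumed example values, not read from the SPDK scripts.
    disk=nvme0n1
    part_no=2
    size=$(( 1073741824 / 512 ))        # 1 GiB expressed in 512-byte sectors (2097152)
    part_start=0 part_end=0

    sgdisk "/dev/$disk" --zap-all       # wipe any existing partition table

    for (( part = 1; part <= part_no; part++ )); do
        (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
        (( part_end   = part_start + size - 1 ))
        flock "/dev/$disk" sgdisk "/dev/$disk" --new=$part:$part_start:$part_end
    done
    # yields --new=1:2048:2099199 and --new=2:2099200:4196351, matching the log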
00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 375806 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:16.782 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:16.783 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:16.783 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:16.783 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 
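The dm_mount steps logged here create a device-mapper target over the two new partitions, wait for /dev/mapper/nvme_dm_test to appear, resolve it to its dm-N node, confirm both partitions list that node as a holder, and only then format and mount it. The table fed to dmsetup is not visible in the log, so the sketch below assumes a simple linear concatenation of the two partitions:

    # Rough sketch of the dm create/verify/mount sequence above. The dmsetup table is
    # an assumption (linear concatenation); the real test pipes its own table in.
    dm_name=nvme_dm_test
    p1=/dev/nvme0n1p1 p2=/dev/nvme0n1p2
    s1=$(blockdev --getsz "$p1") s2=$(blockdev --getsz "$p2")   # sizes in 512-byte sectors

    printf '0 %s linear %s 0\n%s %s linear %s 0\n' \
        "$s1" "$p1" "$s1" "$s2" "$p2" | dmsetup create "$dm_name"

    for t in {1..5}; do                                       # wait briefly for the node
        [[ -e /dev/mapper/$dm_name ]] && break
        sleep 1
    done
    dm=$(basename "$(readlink -f "/dev/mapper/$dm_name")")    # e.g. dm-0
    [[ -e /sys/class/block/nvme0n1p1/holders/$dm ]]           # both partitions must hold
    [[ -e /sys/class/block/nvme0n1p2/holders/$dm ]]           # the new dm node

    mkdir -p /tmp/dm_mount
    mkfs.ext4 -qF "/dev/mapper/$dm_name"
    mount "/dev/mapper/$dm_name" /tmp/dm_mount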
00:05:16.783 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:16.783 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:16.783 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:16.783 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.783 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:16.783 13:31:05 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:16.783 13:31:05 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:16.783 13:31:05 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.079 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local 
mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:20.080 13:31:08 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 
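The verify() pass that follows re-runs setup.sh config with PCI_ALLOWED pinned to the NVMe controller and walks its per-device output line by line (the read -r pci _ _ status loop above): only the allowed BDF is expected to report "Active devices: ..." containing the holders just created, while every other BDF is simply skipped. A condensed sketch of that scan, with the helper invocation written out directly:

    # Condensed sketch of the verify() scan driven by "read -r pci _ _ status" above.
    dev=0000:5e:00.0
    mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0
    found=0

    while read -r pci _ _ status; do
        [[ $pci == "$dev" ]] || continue                          # other BDFs are ignored
        [[ $status == *"Active devices: "*"$mounts"* ]] && found=1
    done < <(PCI_ALLOWED=$dev ./scripts/setup.sh config)

    (( found == 1 )) || echo "expected $mounts to be active on $dev" >&2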
00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.370 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:23.371 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 
(ext4): 53 ef 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:23.371 00:05:23.371 real 0m9.657s 00:05:23.371 user 0m2.345s 00:05:23.371 sys 0m4.424s 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:23.371 13:31:11 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:23.371 ************************************ 00:05:23.371 END TEST dm_mount 00:05:23.371 ************************************ 00:05:23.371 13:31:11 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:23.371 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:23.371 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:23.371 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:23.371 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:23.371 13:31:11 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:23.630 13:31:11 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:23.630 13:31:11 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:23.630 00:05:23.630 real 0m26.361s 00:05:23.630 user 0m7.522s 00:05:23.630 sys 0m13.861s 00:05:23.630 13:31:11 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:23.630 13:31:11 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:23.630 ************************************ 00:05:23.630 END TEST devices 00:05:23.630 ************************************ 00:05:23.630 13:31:12 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:23.630 00:05:23.630 real 1m34.592s 00:05:23.630 user 0m29.312s 00:05:23.630 sys 0m52.317s 00:05:23.630 13:31:12 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:23.630 13:31:12 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:23.630 ************************************ 00:05:23.630 END TEST setup.sh 00:05:23.630 ************************************ 00:05:23.630 13:31:12 -- common/autotest_common.sh@1142 -- # return 0 00:05:23.630 13:31:12 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:26.917 0000:d7:05.5 (8086 201d): 
Skipping not allowed VMD controller at 0000:d7:05.5 00:05:26.917 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:26.917 Hugepages 00:05:26.917 node hugesize free / total 00:05:26.917 node0 1048576kB 0 / 0 00:05:26.917 node0 2048kB 1024 / 1024 00:05:26.917 node1 1048576kB 0 / 0 00:05:26.917 node1 2048kB 1024 / 1024 00:05:26.917 00:05:26.917 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:26.917 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:26.917 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:26.917 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:26.917 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:26.917 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:26.917 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:26.917 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:26.918 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:26.918 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:05:26.918 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:26.918 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:26.918 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:26.918 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:26.918 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:26.918 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:26.918 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:26.918 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:26.918 VMD 0000:85:05.5 8086 201d 1 - - - 00:05:26.918 VMD 0000:d7:05.5 8086 201d 1 - - - 00:05:26.918 13:31:15 -- spdk/autotest.sh@130 -- # uname -s 00:05:26.918 13:31:15 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:05:26.918 13:31:15 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:26.918 13:31:15 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:30.203 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:30.203 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:30.203 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:30.203 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:32.738 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:32.738 13:31:21 -- common/autotest_common.sh@1532 -- # sleep 1 00:05:33.675 13:31:22 -- common/autotest_common.sh@1533 -- # bdfs=() 00:05:33.675 13:31:22 -- common/autotest_common.sh@1533 -- # local bdfs 00:05:33.675 13:31:22 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:05:33.675 13:31:22 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:05:33.675 13:31:22 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:33.675 13:31:22 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:33.675 13:31:22 -- 
common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:33.675 13:31:22 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:33.675 13:31:22 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:33.675 13:31:22 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:33.675 13:31:22 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:33.675 13:31:22 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:36.997 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:36.997 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:36.997 Waiting for block devices as requested 00:05:36.997 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:05:36.998 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:36.998 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:36.998 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:37.256 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:37.256 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:37.256 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:37.515 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:37.515 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:37.515 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:37.773 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:37.773 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:37.773 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:38.032 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:38.032 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:38.032 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:38.291 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:38.291 13:31:26 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:05:38.291 13:31:26 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:05:38.291 13:31:26 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:05:38.291 13:31:26 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:05:38.291 13:31:26 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:38.291 13:31:26 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:05:38.291 13:31:26 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:38.291 13:31:26 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:05:38.291 13:31:26 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:05:38.291 13:31:26 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:05:38.291 13:31:26 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:05:38.291 13:31:26 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:05:38.291 13:31:26 -- common/autotest_common.sh@1545 -- # grep oacs 00:05:38.291 13:31:26 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:05:38.291 13:31:26 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:05:38.291 13:31:26 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:05:38.291 13:31:26 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:05:38.291 13:31:26 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:05:38.291 13:31:26 -- common/autotest_common.sh@1554 -- # cut 
-d: -f2 00:05:38.291 13:31:26 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:05:38.292 13:31:26 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:05:38.292 13:31:26 -- common/autotest_common.sh@1557 -- # continue 00:05:38.292 13:31:26 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:38.292 13:31:26 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:38.292 13:31:26 -- common/autotest_common.sh@10 -- # set +x 00:05:38.292 13:31:26 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:38.292 13:31:26 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:38.292 13:31:26 -- common/autotest_common.sh@10 -- # set +x 00:05:38.292 13:31:26 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:41.582 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:41.582 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:41.582 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:41.582 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:44.114 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:44.373 13:31:32 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:44.373 13:31:32 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:44.373 13:31:32 -- common/autotest_common.sh@10 -- # set +x 00:05:44.373 13:31:32 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:44.373 13:31:32 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:05:44.373 13:31:32 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:05:44.373 13:31:32 -- common/autotest_common.sh@1577 -- # bdfs=() 00:05:44.373 13:31:32 -- common/autotest_common.sh@1577 -- # local bdfs 00:05:44.373 13:31:32 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:05:44.373 13:31:32 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:44.373 13:31:32 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:44.373 13:31:32 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:44.373 13:31:32 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:44.373 13:31:32 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:44.373 13:31:32 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:44.373 13:31:32 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:44.373 13:31:32 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:05:44.373 13:31:32 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:05:44.373 13:31:32 -- common/autotest_common.sh@1580 -- # device=0x0b60 
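The pre_cleanup and opal_revert_cleanup blocks in this stretch enumerate NVMe controllers by piping gen_nvme.sh through jq, then read two fields from nvme id-ctrl: OACS (0x3f here), whose bit 3 indicates Namespace Management support, and unvmcap, which is 0 when there is no unallocated capacity left to revert. A compressed sketch of those checks (nvme0 is assumed below; the script resolves the controller from each BDF via sysfs):

    # Compressed sketch of the BDF enumeration and OACS/unvmcap checks logged above.
    mapfile -t bdfs < <(./scripts/gen_nvme.sh | jq -r '.config[].params.traddr')

    for bdf in "${bdfs[@]}"; do
        ctrl=/dev/$(basename "$(readlink -f /sys/class/nvme/nvme0)")   # nvme0 assumed;
                                                                       # the test greps sysfs for $bdf
        oacs=$(nvme id-ctrl "$ctrl" | grep oacs | cut -d: -f2)         # e.g. " 0x3f"
        (( oacs & 0x8 )) || continue      # bit 3: Namespace Management/Attachment supported?
        unvmcap=$(nvme id-ctrl "$ctrl" | grep unvmcap | cut -d: -f2)
        (( unvmcap == 0 )) && continue    # nothing unallocated, so nothing to revert
        # ...namespace revert/recreate would happen here...
    done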
00:05:44.373 13:31:32 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:05:44.373 13:31:32 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:05:44.373 13:31:32 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:05:44.373 13:31:32 -- common/autotest_common.sh@1593 -- # return 0 00:05:44.373 13:31:32 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:05:44.373 13:31:32 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:05:44.373 13:31:32 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:05:44.373 13:31:32 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:05:44.373 13:31:32 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:05:45.307 Restarting all devices. 00:05:45.307 lstat() error: No such file or directory 00:05:45.307 QAT Error: No GENERAL section found 00:05:45.307 Failed to configure qat_dev0 00:05:45.307 lstat() error: No such file or directory 00:05:45.307 QAT Error: No GENERAL section found 00:05:45.307 Failed to configure qat_dev1 00:05:45.307 lstat() error: No such file or directory 00:05:45.307 QAT Error: No GENERAL section found 00:05:45.307 Failed to configure qat_dev2 00:05:45.307 enable sriov 00:05:45.307 Checking status of all devices. 00:05:45.307 There is 3 QAT acceleration device(s) in the system: 00:05:45.307 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:05:45.307 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:05:45.307 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:05:45.874 0000:3d:00.0 set to 16 VFs 00:05:46.809 0000:3f:00.0 set to 16 VFs 00:05:47.744 0000:da:00.0 set to 16 VFs 00:05:49.122 Properly configured the qat device with driver uio_pci_generic. 00:05:49.122 13:31:37 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:49.122 13:31:37 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:49.122 13:31:37 -- common/autotest_common.sh@10 -- # set +x 00:05:49.122 13:31:37 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:05:49.122 13:31:37 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:49.122 13:31:37 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:49.122 13:31:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.122 13:31:37 -- common/autotest_common.sh@10 -- # set +x 00:05:49.122 ************************************ 00:05:49.122 START TEST env 00:05:49.122 ************************************ 00:05:49.122 13:31:37 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:49.122 * Looking for test storage... 
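Every suite in this log (nvme_mount, dm_mount, env, env_memory, env_vtophys, ...) goes through the same run_test wrapper: print a START banner, time the command, propagate its exit code, and print an END banner, with xtrace toggled off around the bookkeeping. The sketch below only illustrates that convention as it appears in the output; it is not the actual autotest_common.sh implementation:

    # Illustration of the run_test convention visible throughout this log
    # (banners, timing, exit-code propagation); not the real autotest_common.sh.
    run_test() {
        local name=$1; shift
        echo '************************************'
        echo "START TEST $name"
        echo '************************************'
        local start=$SECONDS rc=0
        "$@" || rc=$?
        echo '************************************'
        echo "END TEST $name (rc=$rc, elapsed $((SECONDS - start))s)"
        echo '************************************'
        return $rc
    }

    # usage, mirroring the log:
    run_test env ./test/env/env.sh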
00:05:49.122 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:05:49.122 13:31:37 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:49.122 13:31:37 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:49.122 13:31:37 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.122 13:31:37 env -- common/autotest_common.sh@10 -- # set +x 00:05:49.122 ************************************ 00:05:49.122 START TEST env_memory 00:05:49.122 ************************************ 00:05:49.122 13:31:37 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:49.122 00:05:49.122 00:05:49.122 CUnit - A unit testing framework for C - Version 2.1-3 00:05:49.122 http://cunit.sourceforge.net/ 00:05:49.122 00:05:49.122 00:05:49.122 Suite: memory 00:05:49.122 Test: alloc and free memory map ...[2024-07-12 13:31:37.645523] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:49.122 passed 00:05:49.122 Test: mem map translation ...[2024-07-12 13:31:37.674769] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:49.122 [2024-07-12 13:31:37.674790] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:49.122 [2024-07-12 13:31:37.674845] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:49.122 [2024-07-12 13:31:37.674858] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:49.381 passed 00:05:49.381 Test: mem map registration ...[2024-07-12 13:31:37.732715] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:49.381 [2024-07-12 13:31:37.732737] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:49.381 passed 00:05:49.381 Test: mem map adjacent registrations ...passed 00:05:49.381 00:05:49.381 Run Summary: Type Total Ran Passed Failed Inactive 00:05:49.381 suites 1 1 n/a 0 0 00:05:49.381 tests 4 4 4 0 0 00:05:49.381 asserts 152 152 152 0 n/a 00:05:49.381 00:05:49.381 Elapsed time = 0.196 seconds 00:05:49.381 00:05:49.381 real 0m0.205s 00:05:49.381 user 0m0.197s 00:05:49.381 sys 0m0.007s 00:05:49.381 13:31:37 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:49.381 13:31:37 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:49.381 ************************************ 00:05:49.381 END TEST env_memory 00:05:49.381 ************************************ 00:05:49.381 13:31:37 env -- common/autotest_common.sh@1142 -- # return 0 00:05:49.381 13:31:37 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:49.381 13:31:37 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:49.381 13:31:37 env -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.381 13:31:37 env -- common/autotest_common.sh@10 -- # set +x 00:05:49.381 ************************************ 00:05:49.381 START TEST env_vtophys 00:05:49.381 ************************************ 00:05:49.381 13:31:37 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:49.381 EAL: lib.eal log level changed from notice to debug 00:05:49.381 EAL: Detected lcore 0 as core 0 on socket 0 00:05:49.381 EAL: Detected lcore 1 as core 1 on socket 0 00:05:49.381 EAL: Detected lcore 2 as core 2 on socket 0 00:05:49.381 EAL: Detected lcore 3 as core 3 on socket 0 00:05:49.381 EAL: Detected lcore 4 as core 4 on socket 0 00:05:49.381 EAL: Detected lcore 5 as core 8 on socket 0 00:05:49.381 EAL: Detected lcore 6 as core 9 on socket 0 00:05:49.381 EAL: Detected lcore 7 as core 10 on socket 0 00:05:49.381 EAL: Detected lcore 8 as core 11 on socket 0 00:05:49.381 EAL: Detected lcore 9 as core 16 on socket 0 00:05:49.381 EAL: Detected lcore 10 as core 17 on socket 0 00:05:49.381 EAL: Detected lcore 11 as core 18 on socket 0 00:05:49.381 EAL: Detected lcore 12 as core 19 on socket 0 00:05:49.381 EAL: Detected lcore 13 as core 20 on socket 0 00:05:49.381 EAL: Detected lcore 14 as core 24 on socket 0 00:05:49.381 EAL: Detected lcore 15 as core 25 on socket 0 00:05:49.381 EAL: Detected lcore 16 as core 26 on socket 0 00:05:49.381 EAL: Detected lcore 17 as core 27 on socket 0 00:05:49.381 EAL: Detected lcore 18 as core 0 on socket 1 00:05:49.381 EAL: Detected lcore 19 as core 1 on socket 1 00:05:49.381 EAL: Detected lcore 20 as core 2 on socket 1 00:05:49.381 EAL: Detected lcore 21 as core 3 on socket 1 00:05:49.381 EAL: Detected lcore 22 as core 4 on socket 1 00:05:49.381 EAL: Detected lcore 23 as core 8 on socket 1 00:05:49.381 EAL: Detected lcore 24 as core 9 on socket 1 00:05:49.381 EAL: Detected lcore 25 as core 10 on socket 1 00:05:49.381 EAL: Detected lcore 26 as core 11 on socket 1 00:05:49.381 EAL: Detected lcore 27 as core 16 on socket 1 00:05:49.381 EAL: Detected lcore 28 as core 17 on socket 1 00:05:49.381 EAL: Detected lcore 29 as core 18 on socket 1 00:05:49.381 EAL: Detected lcore 30 as core 19 on socket 1 00:05:49.381 EAL: Detected lcore 31 as core 20 on socket 1 00:05:49.381 EAL: Detected lcore 32 as core 24 on socket 1 00:05:49.381 EAL: Detected lcore 33 as core 25 on socket 1 00:05:49.381 EAL: Detected lcore 34 as core 26 on socket 1 00:05:49.381 EAL: Detected lcore 35 as core 27 on socket 1 00:05:49.381 EAL: Detected lcore 36 as core 0 on socket 0 00:05:49.381 EAL: Detected lcore 37 as core 1 on socket 0 00:05:49.381 EAL: Detected lcore 38 as core 2 on socket 0 00:05:49.381 EAL: Detected lcore 39 as core 3 on socket 0 00:05:49.381 EAL: Detected lcore 40 as core 4 on socket 0 00:05:49.381 EAL: Detected lcore 41 as core 8 on socket 0 00:05:49.381 EAL: Detected lcore 42 as core 9 on socket 0 00:05:49.381 EAL: Detected lcore 43 as core 10 on socket 0 00:05:49.381 EAL: Detected lcore 44 as core 11 on socket 0 00:05:49.381 EAL: Detected lcore 45 as core 16 on socket 0 00:05:49.381 EAL: Detected lcore 46 as core 17 on socket 0 00:05:49.381 EAL: Detected lcore 47 as core 18 on socket 0 00:05:49.381 EAL: Detected lcore 48 as core 19 on socket 0 00:05:49.381 EAL: Detected lcore 49 as core 20 on socket 0 00:05:49.381 EAL: Detected lcore 50 as core 24 on socket 0 00:05:49.381 EAL: Detected lcore 51 as core 25 on socket 0 00:05:49.381 EAL: Detected lcore 52 as core 
26 on socket 0 00:05:49.381 EAL: Detected lcore 53 as core 27 on socket 0 00:05:49.381 EAL: Detected lcore 54 as core 0 on socket 1 00:05:49.381 EAL: Detected lcore 55 as core 1 on socket 1 00:05:49.381 EAL: Detected lcore 56 as core 2 on socket 1 00:05:49.381 EAL: Detected lcore 57 as core 3 on socket 1 00:05:49.381 EAL: Detected lcore 58 as core 4 on socket 1 00:05:49.381 EAL: Detected lcore 59 as core 8 on socket 1 00:05:49.381 EAL: Detected lcore 60 as core 9 on socket 1 00:05:49.381 EAL: Detected lcore 61 as core 10 on socket 1 00:05:49.381 EAL: Detected lcore 62 as core 11 on socket 1 00:05:49.381 EAL: Detected lcore 63 as core 16 on socket 1 00:05:49.381 EAL: Detected lcore 64 as core 17 on socket 1 00:05:49.381 EAL: Detected lcore 65 as core 18 on socket 1 00:05:49.381 EAL: Detected lcore 66 as core 19 on socket 1 00:05:49.381 EAL: Detected lcore 67 as core 20 on socket 1 00:05:49.381 EAL: Detected lcore 68 as core 24 on socket 1 00:05:49.381 EAL: Detected lcore 69 as core 25 on socket 1 00:05:49.381 EAL: Detected lcore 70 as core 26 on socket 1 00:05:49.381 EAL: Detected lcore 71 as core 27 on socket 1 00:05:49.381 EAL: Maximum logical cores by configuration: 128 00:05:49.381 EAL: Detected CPU lcores: 72 00:05:49.381 EAL: Detected NUMA nodes: 2 00:05:49.381 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:05:49.381 EAL: Detected shared linkage of DPDK 00:05:49.382 EAL: No shared files mode enabled, IPC will be disabled 00:05:49.641 EAL: No shared files mode enabled, IPC is disabled 00:05:49.641 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:05:49.641 EAL: PCI driver qat for device 0000:3f:02.2 
wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:05:49.642 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:05:49.642 EAL: Bus pci wants IOVA as 'PA' 00:05:49.642 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:49.642 EAL: Bus vdev wants IOVA as 'DC' 00:05:49.642 EAL: Selected IOVA mode 'PA' 00:05:49.642 EAL: Probing VFIO support... 00:05:49.642 EAL: IOMMU type 1 (Type 1) is supported 00:05:49.642 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:49.642 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:49.642 EAL: VFIO support initialized 00:05:49.642 EAL: Ask a virtual area of 0x2e000 bytes 00:05:49.642 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:49.642 EAL: Setting up physically contiguous memory... 
00:05:49.642 EAL: Setting maximum number of open files to 524288 00:05:49.642 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:49.642 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:49.642 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:49.642 EAL: Ask a virtual area of 0x61000 bytes 00:05:49.642 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:49.642 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:49.642 EAL: Ask a virtual area of 0x400000000 bytes 00:05:49.642 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:49.642 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:49.642 EAL: Ask a virtual area of 0x61000 bytes 00:05:49.642 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:49.642 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:49.642 EAL: Ask a virtual area of 0x400000000 bytes 00:05:49.642 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:49.642 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:49.642 EAL: Ask a virtual area of 0x61000 bytes 00:05:49.642 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:49.642 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:49.642 EAL: Ask a virtual area of 0x400000000 bytes 00:05:49.642 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:49.642 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:49.642 EAL: Ask a virtual area of 0x61000 bytes 00:05:49.642 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:49.642 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:49.642 EAL: Ask a virtual area of 0x400000000 bytes 00:05:49.642 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:49.642 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:49.642 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:49.642 EAL: Ask a virtual area of 0x61000 bytes 00:05:49.642 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:49.642 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:49.642 EAL: Ask a virtual area of 0x400000000 bytes 00:05:49.642 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:49.642 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:49.642 EAL: Ask a virtual area of 0x61000 bytes 00:05:49.642 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:49.642 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:49.642 EAL: Ask a virtual area of 0x400000000 bytes 00:05:49.642 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:49.642 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:49.642 EAL: Ask a virtual area of 0x61000 bytes 00:05:49.642 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:49.642 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:49.642 EAL: Ask a virtual area of 0x400000000 bytes 00:05:49.642 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:49.642 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:49.642 EAL: Ask a virtual area of 0x61000 bytes 00:05:49.642 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:49.642 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:49.642 EAL: Ask a virtual area of 0x400000000 bytes 00:05:49.642 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:05:49.642 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:49.642 EAL: Hugepages will be freed exactly as allocated. 00:05:49.642 EAL: No shared files mode enabled, IPC is disabled 00:05:49.642 EAL: No shared files mode enabled, IPC is disabled 00:05:49.642 EAL: TSC frequency is ~2300000 KHz 00:05:49.642 EAL: Main lcore 0 is ready (tid=7ff3dae35b00;cpuset=[0]) 00:05:49.642 EAL: Trying to obtain current memory policy. 00:05:49.642 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:49.642 EAL: Restoring previous memory policy: 0 00:05:49.642 EAL: request: mp_malloc_sync 00:05:49.642 EAL: No shared files mode enabled, IPC is disabled 00:05:49.642 EAL: Heap on socket 0 was expanded by 2MB 00:05:49.642 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001000000 00:05:49.642 EAL: PCI memory mapped at 0x202001001000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001002000 00:05:49.642 EAL: PCI memory mapped at 0x202001003000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001004000 00:05:49.642 EAL: PCI memory mapped at 0x202001005000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001006000 00:05:49.642 EAL: PCI memory mapped at 0x202001007000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001008000 00:05:49.642 EAL: PCI memory mapped at 0x202001009000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x20200100a000 00:05:49.642 EAL: PCI memory mapped at 0x20200100b000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x20200100c000 00:05:49.642 EAL: PCI memory mapped at 0x20200100d000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x20200100e000 00:05:49.642 EAL: PCI memory mapped at 0x20200100f000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001010000 00:05:49.642 EAL: PCI memory mapped at 0x202001011000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 
EAL: PCI memory mapped at 0x202001012000 00:05:49.642 EAL: PCI memory mapped at 0x202001013000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001014000 00:05:49.642 EAL: PCI memory mapped at 0x202001015000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001016000 00:05:49.642 EAL: PCI memory mapped at 0x202001017000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001018000 00:05:49.642 EAL: PCI memory mapped at 0x202001019000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x20200101a000 00:05:49.642 EAL: PCI memory mapped at 0x20200101b000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x20200101c000 00:05:49.642 EAL: PCI memory mapped at 0x20200101d000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:49.642 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x20200101e000 00:05:49.642 EAL: PCI memory mapped at 0x20200101f000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:49.642 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001020000 00:05:49.642 EAL: PCI memory mapped at 0x202001021000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:49.642 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001022000 00:05:49.642 EAL: PCI memory mapped at 0x202001023000 00:05:49.642 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:49.642 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:05:49.642 EAL: probe driver: 8086:37c9 qat 00:05:49.642 EAL: PCI memory mapped at 0x202001024000 00:05:49.643 EAL: PCI memory mapped at 0x202001025000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001026000 00:05:49.643 EAL: PCI memory mapped at 0x202001027000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001028000 00:05:49.643 EAL: PCI memory mapped at 0x202001029000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 
00:05:49.643 EAL: PCI memory mapped at 0x20200102a000 00:05:49.643 EAL: PCI memory mapped at 0x20200102b000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x20200102c000 00:05:49.643 EAL: PCI memory mapped at 0x20200102d000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x20200102e000 00:05:49.643 EAL: PCI memory mapped at 0x20200102f000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001030000 00:05:49.643 EAL: PCI memory mapped at 0x202001031000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001032000 00:05:49.643 EAL: PCI memory mapped at 0x202001033000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001034000 00:05:49.643 EAL: PCI memory mapped at 0x202001035000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001036000 00:05:49.643 EAL: PCI memory mapped at 0x202001037000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001038000 00:05:49.643 EAL: PCI memory mapped at 0x202001039000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x20200103a000 00:05:49.643 EAL: PCI memory mapped at 0x20200103b000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x20200103c000 00:05:49.643 EAL: PCI memory mapped at 0x20200103d000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:49.643 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x20200103e000 00:05:49.643 EAL: PCI memory mapped at 0x20200103f000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:49.643 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001040000 00:05:49.643 EAL: PCI memory mapped at 0x202001041000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:49.643 EAL: Trying to obtain current memory policy. 
00:05:49.643 EAL: Setting policy MPOL_PREFERRED for socket 1 00:05:49.643 EAL: Restoring previous memory policy: 4 00:05:49.643 EAL: request: mp_malloc_sync 00:05:49.643 EAL: No shared files mode enabled, IPC is disabled 00:05:49.643 EAL: Heap on socket 1 was expanded by 2MB 00:05:49.643 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001042000 00:05:49.643 EAL: PCI memory mapped at 0x202001043000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001044000 00:05:49.643 EAL: PCI memory mapped at 0x202001045000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001046000 00:05:49.643 EAL: PCI memory mapped at 0x202001047000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001048000 00:05:49.643 EAL: PCI memory mapped at 0x202001049000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x20200104a000 00:05:49.643 EAL: PCI memory mapped at 0x20200104b000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x20200104c000 00:05:49.643 EAL: PCI memory mapped at 0x20200104d000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x20200104e000 00:05:49.643 EAL: PCI memory mapped at 0x20200104f000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001050000 00:05:49.643 EAL: PCI memory mapped at 0x202001051000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001052000 00:05:49.643 EAL: PCI memory mapped at 0x202001053000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001054000 00:05:49.643 EAL: PCI memory mapped at 0x202001055000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001056000 00:05:49.643 EAL: PCI memory mapped at 0x202001057000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 
00:05:49.643 EAL: PCI device 0000:da:02.4 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x202001058000 00:05:49.643 EAL: PCI memory mapped at 0x202001059000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:02.5 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x20200105a000 00:05:49.643 EAL: PCI memory mapped at 0x20200105b000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:02.6 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x20200105c000 00:05:49.643 EAL: PCI memory mapped at 0x20200105d000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:49.643 EAL: PCI device 0000:da:02.7 on NUMA socket 1 00:05:49.643 EAL: probe driver: 8086:37c9 qat 00:05:49.643 EAL: PCI memory mapped at 0x20200105e000 00:05:49.643 EAL: PCI memory mapped at 0x20200105f000 00:05:49.643 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:49.643 EAL: No shared files mode enabled, IPC is disabled 00:05:49.643 EAL: No shared files mode enabled, IPC is disabled 00:05:49.643 EAL: No PCI address specified using 'addr=' in: bus=pci 00:05:49.643 EAL: Mem event callback 'spdk:(nil)' registered 00:05:49.643 00:05:49.643 00:05:49.643 CUnit - A unit testing framework for C - Version 2.1-3 00:05:49.643 http://cunit.sourceforge.net/ 00:05:49.643 00:05:49.643 00:05:49.643 Suite: components_suite 00:05:49.643 Test: vtophys_malloc_test ...passed 00:05:49.643 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:49.643 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:49.643 EAL: Restoring previous memory policy: 4 00:05:49.643 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.643 EAL: request: mp_malloc_sync 00:05:49.643 EAL: No shared files mode enabled, IPC is disabled 00:05:49.643 EAL: Heap on socket 0 was expanded by 4MB 00:05:49.643 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.643 EAL: request: mp_malloc_sync 00:05:49.643 EAL: No shared files mode enabled, IPC is disabled 00:05:49.643 EAL: Heap on socket 0 was shrunk by 4MB 00:05:49.643 EAL: Trying to obtain current memory policy. 00:05:49.643 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:49.643 EAL: Restoring previous memory policy: 4 00:05:49.643 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.643 EAL: request: mp_malloc_sync 00:05:49.643 EAL: No shared files mode enabled, IPC is disabled 00:05:49.643 EAL: Heap on socket 0 was expanded by 6MB 00:05:49.643 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.643 EAL: request: mp_malloc_sync 00:05:49.643 EAL: No shared files mode enabled, IPC is disabled 00:05:49.643 EAL: Heap on socket 0 was shrunk by 6MB 00:05:49.643 EAL: Trying to obtain current memory policy. 
00:05:49.643 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:49.643 EAL: Restoring previous memory policy: 4 00:05:49.643 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.643 EAL: request: mp_malloc_sync 00:05:49.643 EAL: No shared files mode enabled, IPC is disabled 00:05:49.643 EAL: Heap on socket 0 was expanded by 10MB 00:05:49.643 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.643 EAL: request: mp_malloc_sync 00:05:49.643 EAL: No shared files mode enabled, IPC is disabled 00:05:49.643 EAL: Heap on socket 0 was shrunk by 10MB 00:05:49.643 EAL: Trying to obtain current memory policy. 00:05:49.643 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:49.643 EAL: Restoring previous memory policy: 4 00:05:49.643 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.643 EAL: request: mp_malloc_sync 00:05:49.644 EAL: No shared files mode enabled, IPC is disabled 00:05:49.644 EAL: Heap on socket 0 was expanded by 18MB 00:05:49.644 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.644 EAL: request: mp_malloc_sync 00:05:49.644 EAL: No shared files mode enabled, IPC is disabled 00:05:49.644 EAL: Heap on socket 0 was shrunk by 18MB 00:05:49.644 EAL: Trying to obtain current memory policy. 00:05:49.644 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:49.644 EAL: Restoring previous memory policy: 4 00:05:49.644 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.644 EAL: request: mp_malloc_sync 00:05:49.644 EAL: No shared files mode enabled, IPC is disabled 00:05:49.644 EAL: Heap on socket 0 was expanded by 34MB 00:05:49.644 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.644 EAL: request: mp_malloc_sync 00:05:49.644 EAL: No shared files mode enabled, IPC is disabled 00:05:49.644 EAL: Heap on socket 0 was shrunk by 34MB 00:05:49.644 EAL: Trying to obtain current memory policy. 00:05:49.644 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:49.644 EAL: Restoring previous memory policy: 4 00:05:49.644 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.644 EAL: request: mp_malloc_sync 00:05:49.644 EAL: No shared files mode enabled, IPC is disabled 00:05:49.644 EAL: Heap on socket 0 was expanded by 66MB 00:05:49.644 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.644 EAL: request: mp_malloc_sync 00:05:49.644 EAL: No shared files mode enabled, IPC is disabled 00:05:49.644 EAL: Heap on socket 0 was shrunk by 66MB 00:05:49.644 EAL: Trying to obtain current memory policy. 00:05:49.644 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:49.644 EAL: Restoring previous memory policy: 4 00:05:49.644 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.644 EAL: request: mp_malloc_sync 00:05:49.644 EAL: No shared files mode enabled, IPC is disabled 00:05:49.644 EAL: Heap on socket 0 was expanded by 130MB 00:05:49.644 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.644 EAL: request: mp_malloc_sync 00:05:49.644 EAL: No shared files mode enabled, IPC is disabled 00:05:49.644 EAL: Heap on socket 0 was shrunk by 130MB 00:05:49.644 EAL: Trying to obtain current memory policy. 
00:05:49.644 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:49.903 EAL: Restoring previous memory policy: 4
00:05:49.903 EAL: Calling mem event callback 'spdk:(nil)'
00:05:49.903 EAL: request: mp_malloc_sync
00:05:49.903 EAL: No shared files mode enabled, IPC is disabled
00:05:49.903 EAL: Heap on socket 0 was expanded by 258MB
00:05:49.903 EAL: Calling mem event callback 'spdk:(nil)'
00:05:49.903 EAL: request: mp_malloc_sync
00:05:49.903 EAL: No shared files mode enabled, IPC is disabled
00:05:49.903 EAL: Heap on socket 0 was shrunk by 258MB
00:05:49.903 EAL: Trying to obtain current memory policy.
00:05:49.903 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:49.903 EAL: Restoring previous memory policy: 4
00:05:49.903 EAL: Calling mem event callback 'spdk:(nil)'
00:05:49.903 EAL: request: mp_malloc_sync
00:05:49.903 EAL: No shared files mode enabled, IPC is disabled
00:05:49.903 EAL: Heap on socket 0 was expanded by 514MB
00:05:50.162 EAL: Calling mem event callback 'spdk:(nil)'
00:05:50.162 EAL: request: mp_malloc_sync
00:05:50.162 EAL: No shared files mode enabled, IPC is disabled
00:05:50.162 EAL: Heap on socket 0 was shrunk by 514MB
00:05:50.162 EAL: Trying to obtain current memory policy.
00:05:50.162 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:50.420 EAL: Restoring previous memory policy: 4
00:05:50.420 EAL: Calling mem event callback 'spdk:(nil)'
00:05:50.420 EAL: request: mp_malloc_sync
00:05:50.420 EAL: No shared files mode enabled, IPC is disabled
00:05:50.420 EAL: Heap on socket 0 was expanded by 1026MB
00:05:50.677 EAL: Calling mem event callback 'spdk:(nil)'
00:05:50.936 EAL: request: mp_malloc_sync
00:05:50.936 EAL: No shared files mode enabled, IPC is disabled
00:05:50.936 EAL: Heap on socket 0 was shrunk by 1026MB
00:05:50.936 passed
00:05:50.936
00:05:50.936 Run Summary: Type Total Ran Passed Failed Inactive
00:05:50.936 suites 1 1 n/a 0 0
00:05:50.936 tests 2 2 2 0 0
00:05:50.936 asserts 6002 6002 6002 0 n/a
00:05:50.936
00:05:50.936 Elapsed time = 1.177 seconds
00:05:50.936 EAL: No shared files mode enabled, IPC is disabled
00:05:50.936 EAL: No shared files mode enabled, IPC is disabled
00:05:50.936 EAL: No shared files mode enabled, IPC is disabled
00:05:50.936
00:05:50.936 real 0m1.377s
00:05:50.936 user 0m0.768s
00:05:50.936 sys 0m0.575s
00:05:50.936 13:31:39 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:50.936 13:31:39 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:05:50.936 ************************************
00:05:50.936 END TEST env_vtophys
00:05:50.936 ************************************
00:05:50.937 13:31:39 env -- common/autotest_common.sh@1142 -- # return 0
00:05:50.937 13:31:39 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:05:50.937 13:31:39 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:50.937 13:31:39 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:50.937 13:31:39 env -- common/autotest_common.sh@10 -- # set +x
00:05:50.937 ************************************
00:05:50.937 START TEST env_pci
00:05:50.937 ************************************
00:05:50.937 13:31:39 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:05:50.937
00:05:50.937
00:05:50.937 CUnit - A unit testing framework for C - Version 2.1-3
00:05:50.937 http://cunit.sourceforge.net/
00:05:50.937
00:05:50.937
00:05:50.937 Suite: pci
00:05:50.937 Test: pci_hook ...[2024-07-12 13:31:39.364438] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 384948 has claimed it
00:05:50.937 EAL: Cannot find device (10000:00:01.0)
00:05:50.937 EAL: Failed to attach device on primary process
00:05:50.937 passed
00:05:50.937
00:05:50.937 Run Summary: Type Total Ran Passed Failed Inactive
00:05:50.937 suites 1 1 n/a 0 0
00:05:50.937 tests 1 1 1 0 0
00:05:50.937 asserts 25 25 25 0 n/a
00:05:50.937
00:05:50.937 Elapsed time = 0.052 seconds
00:05:50.937
00:05:50.937 real 0m0.074s
00:05:50.937 user 0m0.023s
00:05:50.937 sys 0m0.050s
00:05:50.937 13:31:39 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:50.937 13:31:39 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:05:50.937 ************************************
00:05:50.937 END TEST env_pci
00:05:50.937 ************************************
00:05:50.937 13:31:39 env -- common/autotest_common.sh@1142 -- # return 0
00:05:50.937 13:31:39 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:05:50.937 13:31:39 env -- env/env.sh@15 -- # uname
00:05:50.937 13:31:39 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:05:50.937 13:31:39 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:05:50.937 13:31:39 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:05:50.937 13:31:39 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:05:50.937 13:31:39 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:50.937 13:31:39 env -- common/autotest_common.sh@10 -- # set +x
00:05:50.937 ************************************
00:05:50.937 START TEST env_dpdk_post_init
00:05:50.937 ************************************
00:05:50.937 13:31:39 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:05:51.197 EAL: Detected CPU lcores: 72
00:05:51.197 EAL: Detected NUMA nodes: 2
00:05:51.197 EAL: Detected shared linkage of DPDK
00:05:51.197 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:05:51.197 EAL: Selected IOVA mode 'PA'
00:05:51.197 EAL: VFIO support initialized
00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0)
00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym
00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym
00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0)
00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym
00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym
00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0)
00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym
00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0,
max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: 
Creating cryptodev 0000:3d:02.2_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:05:51.197 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:51.197 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 
00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters 
- name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max 
queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.198 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:05:51.198 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.198 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:05:51.199 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.199 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:51.199 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:05:51.199 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.199 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:05:51.199 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.199 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:51.199 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:05:51.199 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:51.199 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:05:51.199 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:51.199 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:51.199 EAL: Using IOMMU type 1 (Type 1) 00:05:51.199 EAL: Ignore mapping IO port bar(1) 00:05:51.199 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:05:51.199 EAL: Ignore mapping IO port bar(1) 00:05:51.199 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:05:51.199 EAL: Ignore mapping IO port bar(1) 00:05:51.199 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:05:51.457 EAL: Ignore mapping IO port bar(1) 00:05:51.457 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:05:51.457 EAL: Ignore mapping IO port bar(1) 00:05:51.458 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:05:51.458 EAL: Ignore mapping IO port bar(1) 00:05:51.458 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:05:51.458 EAL: Ignore mapping IO port bar(1) 00:05:51.458 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:05:51.458 EAL: Ignore mapping IO port bar(1) 00:05:51.458 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:05:51.717 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:05:51.717 EAL: Ignore mapping IO port bar(1) 00:05:51.717 EAL: Probe PCI driver: 
spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1)
00:05:51.717 EAL: Ignore mapping IO port bar(1)
00:05:51.717 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1)
00:05:51.717 EAL: Ignore mapping IO port bar(1)
00:05:51.717 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1)
00:05:51.717 EAL: Ignore mapping IO port bar(1)
00:05:51.717 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1)
00:05:51.717 EAL: Ignore mapping IO port bar(1)
00:05:51.717 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1)
00:05:51.717 EAL: Ignore mapping IO port bar(1)
00:05:51.717 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1)
00:05:51.717 EAL: Ignore mapping IO port bar(1)
00:05:51.717 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1)
00:05:51.717 EAL: Ignore mapping IO port bar(1)
00:05:51.717 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1)
00:05:54.249 EAL: Releasing PCI mapped resource for 0000:5e:00.0
00:05:54.249 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000
00:05:54.249 Starting DPDK initialization...
00:05:54.249 Starting SPDK post initialization...
00:05:54.249 SPDK NVMe probe
00:05:54.249 Attaching to 0000:5e:00.0
00:05:54.249 Attached to 0000:5e:00.0
00:05:54.249 Cleaning up...
00:05:54.249
00:05:54.249 real 0m3.268s
00:05:54.249 user 0m2.263s
00:05:54.249 sys 0m0.584s
00:05:54.249 13:31:42 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:54.249 13:31:42 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x
00:05:54.249 ************************************
00:05:54.249 END TEST env_dpdk_post_init
00:05:54.249 ************************************
00:05:54.249 13:31:42 env -- common/autotest_common.sh@1142 -- # return 0
00:05:54.249 13:31:42 env -- env/env.sh@26 -- # uname
00:05:54.512 13:31:42 env -- env/env.sh@26 -- # '[' Linux = Linux ']'
00:05:54.512 13:31:42 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:05:54.512 13:31:42 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:54.512 13:31:42 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:54.512 13:31:42 env -- common/autotest_common.sh@10 -- # set +x
00:05:54.512 ************************************
00:05:54.512 START TEST env_mem_callbacks
00:05:54.512 ************************************
00:05:54.512 13:31:42 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:05:54.512 EAL: Detected CPU lcores: 72
00:05:54.512 EAL: Detected NUMA nodes: 2
00:05:54.512 EAL: Detected shared linkage of DPDK
00:05:54.512 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:05:54.512 EAL: Selected IOVA mode 'PA'
00:05:54.512 EAL: VFIO support initialized
00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0)
00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym
00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym
00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0)
00:05:54.512 CRYPTODEV: Creating
cryptodev 0000:3d:01.1_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:05:54.512 
CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.512 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:05:54.512 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:05:54.512 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 
0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue 
pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating 
cryptodev 0000:da:01.1_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:05:54.513 
CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.513 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:05:54.513 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.513 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:05:54.514 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.514 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:05:54.514 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:05:54.514 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.514 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:05:54.514 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.514 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:05:54.514 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:05:54.514 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:05:54.514 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:05:54.514 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:05:54.514 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:54.514 00:05:54.514 00:05:54.514 CUnit - A unit testing framework for C - Version 2.1-3 00:05:54.514 http://cunit.sourceforge.net/ 00:05:54.514 00:05:54.514 00:05:54.514 Suite: memory 00:05:54.514 Test: test ... 
00:05:54.514 register 0x200000200000 2097152 00:05:54.514 register 0x201000a00000 2097152 00:05:54.514 malloc 3145728 00:05:54.514 register 0x200000400000 4194304 00:05:54.514 buf 0x200000500000 len 3145728 PASSED 00:05:54.514 malloc 64 00:05:54.514 buf 0x2000004fff40 len 64 PASSED 00:05:54.514 malloc 4194304 00:05:54.514 register 0x200000800000 6291456 00:05:54.514 buf 0x200000a00000 len 4194304 PASSED 00:05:54.514 free 0x200000500000 3145728 00:05:54.514 free 0x2000004fff40 64 00:05:54.514 unregister 0x200000400000 4194304 PASSED 00:05:54.514 free 0x200000a00000 4194304 00:05:54.514 unregister 0x200000800000 6291456 PASSED 00:05:54.514 malloc 8388608 00:05:54.514 register 0x200000400000 10485760 00:05:54.514 buf 0x200000600000 len 8388608 PASSED 00:05:54.514 free 0x200000600000 8388608 00:05:54.514 unregister 0x200000400000 10485760 PASSED 00:05:54.514 passed 00:05:54.514 00:05:54.514 Run Summary: Type Total Ran Passed Failed Inactive 00:05:54.514 suites 1 1 n/a 0 0 00:05:54.514 tests 1 1 1 0 0 00:05:54.514 asserts 16 16 16 0 n/a 00:05:54.514 00:05:54.514 Elapsed time = 0.008 seconds 00:05:54.514 00:05:54.514 real 0m0.176s 00:05:54.514 user 0m0.055s 00:05:54.514 sys 0m0.120s 00:05:54.514 13:31:43 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:54.514 13:31:43 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:05:54.514 ************************************ 00:05:54.514 END TEST env_mem_callbacks 00:05:54.514 ************************************ 00:05:54.514 13:31:43 env -- common/autotest_common.sh@1142 -- # return 0 00:05:54.514 00:05:54.514 real 0m5.619s 00:05:54.514 user 0m3.503s 00:05:54.514 sys 0m1.700s 00:05:54.514 13:31:43 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:54.514 13:31:43 env -- common/autotest_common.sh@10 -- # set +x 00:05:54.514 ************************************ 00:05:54.514 END TEST env 00:05:54.514 ************************************ 00:05:54.773 13:31:43 -- common/autotest_common.sh@1142 -- # return 0 00:05:54.773 13:31:43 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:54.773 13:31:43 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:54.773 13:31:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:54.773 13:31:43 -- common/autotest_common.sh@10 -- # set +x 00:05:54.773 ************************************ 00:05:54.773 START TEST rpc 00:05:54.773 ************************************ 00:05:54.773 13:31:43 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:54.773 * Looking for test storage... 
00:05:54.773 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:54.773 13:31:43 rpc -- rpc/rpc.sh@65 -- # spdk_pid=385600 00:05:54.773 13:31:43 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:54.773 13:31:43 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:54.773 13:31:43 rpc -- rpc/rpc.sh@67 -- # waitforlisten 385600 00:05:54.773 13:31:43 rpc -- common/autotest_common.sh@829 -- # '[' -z 385600 ']' 00:05:54.773 13:31:43 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:54.773 13:31:43 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:54.773 13:31:43 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:54.773 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:54.773 13:31:43 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:54.773 13:31:43 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:54.773 [2024-07-12 13:31:43.352131] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:05:54.773 [2024-07-12 13:31:43.352210] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid385600 ] 00:05:55.031 [2024-07-12 13:31:43.485198] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:55.031 [2024-07-12 13:31:43.591144] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:55.031 [2024-07-12 13:31:43.591199] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 385600' to capture a snapshot of events at runtime. 00:05:55.031 [2024-07-12 13:31:43.591214] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:55.031 [2024-07-12 13:31:43.591228] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:55.031 [2024-07-12 13:31:43.591239] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid385600 for offline analysis/debug. 
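Throughout the rpc suite that follows, calls written as rpc_cmd <method> are thin shell wrappers around SPDK's JSON-RPC client talking to the spdk_tgt instance started above. As a rough, illustrative sketch only (assuming the in-tree scripts/rpc.py client, the default /var/tmp/spdk.sock socket, and a fresh target so the bdev names come out as Malloc0/Passthru0 as they do in this log), the same create/inspect/delete flow exercised by rpc_integrity could be driven by hand like this:

    # start a target and wait for its JSON-RPC socket to answer (paths are illustrative)
    ./build/bin/spdk_tgt &
    tgt_pid=$!
    until ./scripts/rpc.py -s /var/tmp/spdk.sock spdk_get_version >/dev/null 2>&1; do sleep 0.2; done
    ./scripts/rpc.py bdev_malloc_create -b Malloc0 8 512      # 8 MiB malloc bdev with 512-byte blocks
    ./scripts/rpc.py bdev_passthru_create -b Malloc0 -p Passthru0
    ./scripts/rpc.py bdev_get_bdevs | jq length               # expect 2 while the passthru exists
    ./scripts/rpc.py bdev_passthru_delete Passthru0
    ./scripts/rpc.py bdev_malloc_delete Malloc0
    kill "$tgt_pid"; wait "$tgt_pid"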
00:05:55.031 [2024-07-12 13:31:43.591277] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.968 13:31:44 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:55.968 13:31:44 rpc -- common/autotest_common.sh@862 -- # return 0 00:05:55.968 13:31:44 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:55.968 13:31:44 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:55.968 13:31:44 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:55.968 13:31:44 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:55.968 13:31:44 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:55.968 13:31:44 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:55.968 13:31:44 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:55.968 ************************************ 00:05:55.968 START TEST rpc_integrity 00:05:55.968 ************************************ 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:55.968 { 00:05:55.968 "name": "Malloc0", 00:05:55.968 "aliases": [ 00:05:55.968 "5e7e5ad3-d4bc-43e7-8112-0fafd72aa198" 00:05:55.968 ], 00:05:55.968 "product_name": "Malloc disk", 00:05:55.968 "block_size": 512, 00:05:55.968 "num_blocks": 16384, 00:05:55.968 "uuid": "5e7e5ad3-d4bc-43e7-8112-0fafd72aa198", 00:05:55.968 "assigned_rate_limits": { 00:05:55.968 "rw_ios_per_sec": 0, 00:05:55.968 "rw_mbytes_per_sec": 0, 00:05:55.968 "r_mbytes_per_sec": 0, 00:05:55.968 "w_mbytes_per_sec": 0 00:05:55.968 }, 00:05:55.968 
"claimed": false, 00:05:55.968 "zoned": false, 00:05:55.968 "supported_io_types": { 00:05:55.968 "read": true, 00:05:55.968 "write": true, 00:05:55.968 "unmap": true, 00:05:55.968 "flush": true, 00:05:55.968 "reset": true, 00:05:55.968 "nvme_admin": false, 00:05:55.968 "nvme_io": false, 00:05:55.968 "nvme_io_md": false, 00:05:55.968 "write_zeroes": true, 00:05:55.968 "zcopy": true, 00:05:55.968 "get_zone_info": false, 00:05:55.968 "zone_management": false, 00:05:55.968 "zone_append": false, 00:05:55.968 "compare": false, 00:05:55.968 "compare_and_write": false, 00:05:55.968 "abort": true, 00:05:55.968 "seek_hole": false, 00:05:55.968 "seek_data": false, 00:05:55.968 "copy": true, 00:05:55.968 "nvme_iov_md": false 00:05:55.968 }, 00:05:55.968 "memory_domains": [ 00:05:55.968 { 00:05:55.968 "dma_device_id": "system", 00:05:55.968 "dma_device_type": 1 00:05:55.968 }, 00:05:55.968 { 00:05:55.968 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:55.968 "dma_device_type": 2 00:05:55.968 } 00:05:55.968 ], 00:05:55.968 "driver_specific": {} 00:05:55.968 } 00:05:55.968 ]' 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:55.968 [2024-07-12 13:31:44.453592] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:55.968 [2024-07-12 13:31:44.453634] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:55.968 [2024-07-12 13:31:44.453654] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1226870 00:05:55.968 [2024-07-12 13:31:44.453667] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:55.968 [2024-07-12 13:31:44.455277] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:55.968 [2024-07-12 13:31:44.455306] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:55.968 Passthru0 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:55.968 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:55.968 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:55.968 { 00:05:55.968 "name": "Malloc0", 00:05:55.968 "aliases": [ 00:05:55.968 "5e7e5ad3-d4bc-43e7-8112-0fafd72aa198" 00:05:55.968 ], 00:05:55.968 "product_name": "Malloc disk", 00:05:55.968 "block_size": 512, 00:05:55.968 "num_blocks": 16384, 00:05:55.968 "uuid": "5e7e5ad3-d4bc-43e7-8112-0fafd72aa198", 00:05:55.968 "assigned_rate_limits": { 00:05:55.968 "rw_ios_per_sec": 0, 00:05:55.968 "rw_mbytes_per_sec": 0, 00:05:55.968 "r_mbytes_per_sec": 0, 00:05:55.968 "w_mbytes_per_sec": 0 00:05:55.968 }, 00:05:55.968 "claimed": true, 00:05:55.968 "claim_type": "exclusive_write", 00:05:55.968 "zoned": false, 00:05:55.968 "supported_io_types": { 00:05:55.968 "read": true, 00:05:55.968 "write": true, 00:05:55.968 "unmap": true, 00:05:55.968 "flush": true, 
00:05:55.968 "reset": true, 00:05:55.968 "nvme_admin": false, 00:05:55.968 "nvme_io": false, 00:05:55.968 "nvme_io_md": false, 00:05:55.968 "write_zeroes": true, 00:05:55.968 "zcopy": true, 00:05:55.968 "get_zone_info": false, 00:05:55.968 "zone_management": false, 00:05:55.968 "zone_append": false, 00:05:55.968 "compare": false, 00:05:55.968 "compare_and_write": false, 00:05:55.968 "abort": true, 00:05:55.968 "seek_hole": false, 00:05:55.969 "seek_data": false, 00:05:55.969 "copy": true, 00:05:55.969 "nvme_iov_md": false 00:05:55.969 }, 00:05:55.969 "memory_domains": [ 00:05:55.969 { 00:05:55.969 "dma_device_id": "system", 00:05:55.969 "dma_device_type": 1 00:05:55.969 }, 00:05:55.969 { 00:05:55.969 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:55.969 "dma_device_type": 2 00:05:55.969 } 00:05:55.969 ], 00:05:55.969 "driver_specific": {} 00:05:55.969 }, 00:05:55.969 { 00:05:55.969 "name": "Passthru0", 00:05:55.969 "aliases": [ 00:05:55.969 "74156e88-cc64-5d15-8361-9aaf993e1458" 00:05:55.969 ], 00:05:55.969 "product_name": "passthru", 00:05:55.969 "block_size": 512, 00:05:55.969 "num_blocks": 16384, 00:05:55.969 "uuid": "74156e88-cc64-5d15-8361-9aaf993e1458", 00:05:55.969 "assigned_rate_limits": { 00:05:55.969 "rw_ios_per_sec": 0, 00:05:55.969 "rw_mbytes_per_sec": 0, 00:05:55.969 "r_mbytes_per_sec": 0, 00:05:55.969 "w_mbytes_per_sec": 0 00:05:55.969 }, 00:05:55.969 "claimed": false, 00:05:55.969 "zoned": false, 00:05:55.969 "supported_io_types": { 00:05:55.969 "read": true, 00:05:55.969 "write": true, 00:05:55.969 "unmap": true, 00:05:55.969 "flush": true, 00:05:55.969 "reset": true, 00:05:55.969 "nvme_admin": false, 00:05:55.969 "nvme_io": false, 00:05:55.969 "nvme_io_md": false, 00:05:55.969 "write_zeroes": true, 00:05:55.969 "zcopy": true, 00:05:55.969 "get_zone_info": false, 00:05:55.969 "zone_management": false, 00:05:55.969 "zone_append": false, 00:05:55.969 "compare": false, 00:05:55.969 "compare_and_write": false, 00:05:55.969 "abort": true, 00:05:55.969 "seek_hole": false, 00:05:55.969 "seek_data": false, 00:05:55.969 "copy": true, 00:05:55.969 "nvme_iov_md": false 00:05:55.969 }, 00:05:55.969 "memory_domains": [ 00:05:55.969 { 00:05:55.969 "dma_device_id": "system", 00:05:55.969 "dma_device_type": 1 00:05:55.969 }, 00:05:55.969 { 00:05:55.969 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:55.969 "dma_device_type": 2 00:05:55.969 } 00:05:55.969 ], 00:05:55.969 "driver_specific": { 00:05:55.969 "passthru": { 00:05:55.969 "name": "Passthru0", 00:05:55.969 "base_bdev_name": "Malloc0" 00:05:55.969 } 00:05:55.969 } 00:05:55.969 } 00:05:55.969 ]' 00:05:55.969 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:55.969 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:55.969 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:55.969 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:55.969 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:55.969 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:55.969 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:55.969 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:55.969 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:55.969 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:55.969 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd 
bdev_get_bdevs 00:05:55.969 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:55.969 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.227 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:56.227 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:56.227 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:56.227 13:31:44 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:56.227 00:05:56.227 real 0m0.294s 00:05:56.227 user 0m0.178s 00:05:56.227 sys 0m0.054s 00:05:56.227 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:56.227 13:31:44 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.227 ************************************ 00:05:56.227 END TEST rpc_integrity 00:05:56.227 ************************************ 00:05:56.227 13:31:44 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:56.227 13:31:44 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:56.227 13:31:44 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:56.227 13:31:44 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:56.227 13:31:44 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:56.227 ************************************ 00:05:56.227 START TEST rpc_plugins 00:05:56.227 ************************************ 00:05:56.227 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:05:56.227 13:31:44 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:56.227 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:56.227 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:56.227 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:56.227 13:31:44 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:56.227 13:31:44 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:56.228 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:56.228 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:56.228 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:56.228 13:31:44 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:56.228 { 00:05:56.228 "name": "Malloc1", 00:05:56.228 "aliases": [ 00:05:56.228 "c21ca236-6329-41da-a6c2-157bdedcef23" 00:05:56.228 ], 00:05:56.228 "product_name": "Malloc disk", 00:05:56.228 "block_size": 4096, 00:05:56.228 "num_blocks": 256, 00:05:56.228 "uuid": "c21ca236-6329-41da-a6c2-157bdedcef23", 00:05:56.228 "assigned_rate_limits": { 00:05:56.228 "rw_ios_per_sec": 0, 00:05:56.228 "rw_mbytes_per_sec": 0, 00:05:56.228 "r_mbytes_per_sec": 0, 00:05:56.228 "w_mbytes_per_sec": 0 00:05:56.228 }, 00:05:56.228 "claimed": false, 00:05:56.228 "zoned": false, 00:05:56.228 "supported_io_types": { 00:05:56.228 "read": true, 00:05:56.228 "write": true, 00:05:56.228 "unmap": true, 00:05:56.228 "flush": true, 00:05:56.228 "reset": true, 00:05:56.228 "nvme_admin": false, 00:05:56.228 "nvme_io": false, 00:05:56.228 "nvme_io_md": false, 00:05:56.228 "write_zeroes": true, 00:05:56.228 "zcopy": true, 00:05:56.228 "get_zone_info": false, 00:05:56.228 "zone_management": false, 00:05:56.228 "zone_append": false, 00:05:56.228 "compare": false, 00:05:56.228 "compare_and_write": false, 00:05:56.228 "abort": true, 00:05:56.228 "seek_hole": false, 00:05:56.228 "seek_data": 
false, 00:05:56.228 "copy": true, 00:05:56.228 "nvme_iov_md": false 00:05:56.228 }, 00:05:56.228 "memory_domains": [ 00:05:56.228 { 00:05:56.228 "dma_device_id": "system", 00:05:56.228 "dma_device_type": 1 00:05:56.228 }, 00:05:56.228 { 00:05:56.228 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:56.228 "dma_device_type": 2 00:05:56.228 } 00:05:56.228 ], 00:05:56.228 "driver_specific": {} 00:05:56.228 } 00:05:56.228 ]' 00:05:56.228 13:31:44 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:56.228 13:31:44 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:56.228 13:31:44 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:56.228 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:56.228 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:56.228 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:56.228 13:31:44 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:56.228 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:56.228 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:56.228 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:56.228 13:31:44 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:56.228 13:31:44 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:56.487 13:31:44 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:56.487 00:05:56.487 real 0m0.150s 00:05:56.487 user 0m0.090s 00:05:56.487 sys 0m0.027s 00:05:56.487 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:56.487 13:31:44 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:56.487 ************************************ 00:05:56.487 END TEST rpc_plugins 00:05:56.487 ************************************ 00:05:56.487 13:31:44 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:56.487 13:31:44 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:56.487 13:31:44 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:56.487 13:31:44 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:56.487 13:31:44 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:56.487 ************************************ 00:05:56.487 START TEST rpc_trace_cmd_test 00:05:56.487 ************************************ 00:05:56.487 13:31:44 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:05:56.487 13:31:44 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:56.487 13:31:44 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:56.487 13:31:44 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:56.487 13:31:44 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:56.487 13:31:44 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:56.487 13:31:44 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:56.487 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid385600", 00:05:56.487 "tpoint_group_mask": "0x8", 00:05:56.487 "iscsi_conn": { 00:05:56.487 "mask": "0x2", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "scsi": { 00:05:56.487 "mask": "0x4", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "bdev": { 00:05:56.487 "mask": "0x8", 00:05:56.487 "tpoint_mask": "0xffffffffffffffff" 00:05:56.487 }, 00:05:56.487 "nvmf_rdma": { 00:05:56.487 
"mask": "0x10", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "nvmf_tcp": { 00:05:56.487 "mask": "0x20", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "ftl": { 00:05:56.487 "mask": "0x40", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "blobfs": { 00:05:56.487 "mask": "0x80", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "dsa": { 00:05:56.487 "mask": "0x200", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "thread": { 00:05:56.487 "mask": "0x400", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "nvme_pcie": { 00:05:56.487 "mask": "0x800", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "iaa": { 00:05:56.487 "mask": "0x1000", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "nvme_tcp": { 00:05:56.487 "mask": "0x2000", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "bdev_nvme": { 00:05:56.487 "mask": "0x4000", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 }, 00:05:56.487 "sock": { 00:05:56.487 "mask": "0x8000", 00:05:56.487 "tpoint_mask": "0x0" 00:05:56.487 } 00:05:56.487 }' 00:05:56.487 13:31:44 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:56.487 13:31:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:56.487 13:31:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:56.487 13:31:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:56.487 13:31:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:56.746 13:31:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:56.746 13:31:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:56.746 13:31:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:56.746 13:31:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:56.746 13:31:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:56.746 00:05:56.746 real 0m0.281s 00:05:56.746 user 0m0.232s 00:05:56.746 sys 0m0.040s 00:05:56.746 13:31:45 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:56.746 13:31:45 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:56.746 ************************************ 00:05:56.746 END TEST rpc_trace_cmd_test 00:05:56.746 ************************************ 00:05:56.746 13:31:45 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:56.746 13:31:45 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:56.746 13:31:45 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:56.746 13:31:45 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:56.746 13:31:45 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:56.746 13:31:45 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:56.746 13:31:45 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:56.746 ************************************ 00:05:56.746 START TEST rpc_daemon_integrity 00:05:56.746 ************************************ 00:05:56.746 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:05:56.746 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:56.746 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:56.746 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.746 13:31:45 rpc.rpc_daemon_integrity -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:56.746 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:56.746 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:57.006 { 00:05:57.006 "name": "Malloc2", 00:05:57.006 "aliases": [ 00:05:57.006 "f7092a05-9a6b-4a5e-a982-1af5506e864f" 00:05:57.006 ], 00:05:57.006 "product_name": "Malloc disk", 00:05:57.006 "block_size": 512, 00:05:57.006 "num_blocks": 16384, 00:05:57.006 "uuid": "f7092a05-9a6b-4a5e-a982-1af5506e864f", 00:05:57.006 "assigned_rate_limits": { 00:05:57.006 "rw_ios_per_sec": 0, 00:05:57.006 "rw_mbytes_per_sec": 0, 00:05:57.006 "r_mbytes_per_sec": 0, 00:05:57.006 "w_mbytes_per_sec": 0 00:05:57.006 }, 00:05:57.006 "claimed": false, 00:05:57.006 "zoned": false, 00:05:57.006 "supported_io_types": { 00:05:57.006 "read": true, 00:05:57.006 "write": true, 00:05:57.006 "unmap": true, 00:05:57.006 "flush": true, 00:05:57.006 "reset": true, 00:05:57.006 "nvme_admin": false, 00:05:57.006 "nvme_io": false, 00:05:57.006 "nvme_io_md": false, 00:05:57.006 "write_zeroes": true, 00:05:57.006 "zcopy": true, 00:05:57.006 "get_zone_info": false, 00:05:57.006 "zone_management": false, 00:05:57.006 "zone_append": false, 00:05:57.006 "compare": false, 00:05:57.006 "compare_and_write": false, 00:05:57.006 "abort": true, 00:05:57.006 "seek_hole": false, 00:05:57.006 "seek_data": false, 00:05:57.006 "copy": true, 00:05:57.006 "nvme_iov_md": false 00:05:57.006 }, 00:05:57.006 "memory_domains": [ 00:05:57.006 { 00:05:57.006 "dma_device_id": "system", 00:05:57.006 "dma_device_type": 1 00:05:57.006 }, 00:05:57.006 { 00:05:57.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:57.006 "dma_device_type": 2 00:05:57.006 } 00:05:57.006 ], 00:05:57.006 "driver_specific": {} 00:05:57.006 } 00:05:57.006 ]' 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:57.006 [2024-07-12 13:31:45.416348] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:57.006 [2024-07-12 13:31:45.416387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:57.006 
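The bdev_get_bdevs dumps printed in this suite are plain JSON, so the assertions the test makes with jq length can also be spot-checked field by field. A small illustrative sketch, assuming a dump saved to a file named bdevs.json (the field names below are taken from the dumps in this log):

    jq 'length' bdevs.json                                                                            # 2 once Passthru0 is created
    jq -r '.[] | select(.name=="Malloc2") | .claimed' bdevs.json                                      # true while the passthru holds its claim
    jq -r '.[] | select(.name=="Passthru0") | .driver_specific.passthru.base_bdev_name' bdevs.json    # Malloc2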
[2024-07-12 13:31:45.416407] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1226d40 00:05:57.006 [2024-07-12 13:31:45.416420] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:57.006 [2024-07-12 13:31:45.417790] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:57.006 [2024-07-12 13:31:45.417823] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:57.006 Passthru0 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:57.006 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:57.006 { 00:05:57.006 "name": "Malloc2", 00:05:57.006 "aliases": [ 00:05:57.006 "f7092a05-9a6b-4a5e-a982-1af5506e864f" 00:05:57.006 ], 00:05:57.006 "product_name": "Malloc disk", 00:05:57.006 "block_size": 512, 00:05:57.006 "num_blocks": 16384, 00:05:57.006 "uuid": "f7092a05-9a6b-4a5e-a982-1af5506e864f", 00:05:57.006 "assigned_rate_limits": { 00:05:57.006 "rw_ios_per_sec": 0, 00:05:57.006 "rw_mbytes_per_sec": 0, 00:05:57.006 "r_mbytes_per_sec": 0, 00:05:57.006 "w_mbytes_per_sec": 0 00:05:57.006 }, 00:05:57.006 "claimed": true, 00:05:57.006 "claim_type": "exclusive_write", 00:05:57.006 "zoned": false, 00:05:57.006 "supported_io_types": { 00:05:57.006 "read": true, 00:05:57.006 "write": true, 00:05:57.006 "unmap": true, 00:05:57.006 "flush": true, 00:05:57.006 "reset": true, 00:05:57.006 "nvme_admin": false, 00:05:57.006 "nvme_io": false, 00:05:57.006 "nvme_io_md": false, 00:05:57.006 "write_zeroes": true, 00:05:57.006 "zcopy": true, 00:05:57.006 "get_zone_info": false, 00:05:57.006 "zone_management": false, 00:05:57.006 "zone_append": false, 00:05:57.006 "compare": false, 00:05:57.006 "compare_and_write": false, 00:05:57.006 "abort": true, 00:05:57.006 "seek_hole": false, 00:05:57.006 "seek_data": false, 00:05:57.006 "copy": true, 00:05:57.006 "nvme_iov_md": false 00:05:57.006 }, 00:05:57.006 "memory_domains": [ 00:05:57.006 { 00:05:57.006 "dma_device_id": "system", 00:05:57.006 "dma_device_type": 1 00:05:57.006 }, 00:05:57.006 { 00:05:57.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:57.006 "dma_device_type": 2 00:05:57.006 } 00:05:57.006 ], 00:05:57.006 "driver_specific": {} 00:05:57.006 }, 00:05:57.007 { 00:05:57.007 "name": "Passthru0", 00:05:57.007 "aliases": [ 00:05:57.007 "408415b0-8fdb-54d0-a813-92d7ed92ddbb" 00:05:57.007 ], 00:05:57.007 "product_name": "passthru", 00:05:57.007 "block_size": 512, 00:05:57.007 "num_blocks": 16384, 00:05:57.007 "uuid": "408415b0-8fdb-54d0-a813-92d7ed92ddbb", 00:05:57.007 "assigned_rate_limits": { 00:05:57.007 "rw_ios_per_sec": 0, 00:05:57.007 "rw_mbytes_per_sec": 0, 00:05:57.007 "r_mbytes_per_sec": 0, 00:05:57.007 "w_mbytes_per_sec": 0 00:05:57.007 }, 00:05:57.007 "claimed": false, 00:05:57.007 "zoned": false, 00:05:57.007 "supported_io_types": { 00:05:57.007 "read": true, 00:05:57.007 "write": true, 00:05:57.007 "unmap": true, 00:05:57.007 "flush": true, 00:05:57.007 "reset": true, 00:05:57.007 "nvme_admin": false, 00:05:57.007 "nvme_io": false, 00:05:57.007 "nvme_io_md": false, 00:05:57.007 
"write_zeroes": true, 00:05:57.007 "zcopy": true, 00:05:57.007 "get_zone_info": false, 00:05:57.007 "zone_management": false, 00:05:57.007 "zone_append": false, 00:05:57.007 "compare": false, 00:05:57.007 "compare_and_write": false, 00:05:57.007 "abort": true, 00:05:57.007 "seek_hole": false, 00:05:57.007 "seek_data": false, 00:05:57.007 "copy": true, 00:05:57.007 "nvme_iov_md": false 00:05:57.007 }, 00:05:57.007 "memory_domains": [ 00:05:57.007 { 00:05:57.007 "dma_device_id": "system", 00:05:57.007 "dma_device_type": 1 00:05:57.007 }, 00:05:57.007 { 00:05:57.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:57.007 "dma_device_type": 2 00:05:57.007 } 00:05:57.007 ], 00:05:57.007 "driver_specific": { 00:05:57.007 "passthru": { 00:05:57.007 "name": "Passthru0", 00:05:57.007 "base_bdev_name": "Malloc2" 00:05:57.007 } 00:05:57.007 } 00:05:57.007 } 00:05:57.007 ]' 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:57.007 00:05:57.007 real 0m0.307s 00:05:57.007 user 0m0.189s 00:05:57.007 sys 0m0.057s 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:57.007 13:31:45 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:57.007 ************************************ 00:05:57.007 END TEST rpc_daemon_integrity 00:05:57.007 ************************************ 00:05:57.266 13:31:45 rpc -- common/autotest_common.sh@1142 -- # return 0 00:05:57.266 13:31:45 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:57.266 13:31:45 rpc -- rpc/rpc.sh@84 -- # killprocess 385600 00:05:57.266 13:31:45 rpc -- common/autotest_common.sh@948 -- # '[' -z 385600 ']' 00:05:57.266 13:31:45 rpc -- common/autotest_common.sh@952 -- # kill -0 385600 00:05:57.266 13:31:45 rpc -- common/autotest_common.sh@953 -- # uname 00:05:57.266 13:31:45 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:57.266 13:31:45 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 385600 00:05:57.266 13:31:45 rpc -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:57.266 13:31:45 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:57.266 13:31:45 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 385600' 00:05:57.266 killing process with pid 385600 00:05:57.266 13:31:45 rpc -- common/autotest_common.sh@967 -- # kill 385600 00:05:57.266 13:31:45 rpc -- common/autotest_common.sh@972 -- # wait 385600 00:05:57.524 00:05:57.524 real 0m2.890s 00:05:57.525 user 0m3.637s 00:05:57.525 sys 0m0.967s 00:05:57.525 13:31:46 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:57.525 13:31:46 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:57.525 ************************************ 00:05:57.525 END TEST rpc 00:05:57.525 ************************************ 00:05:57.525 13:31:46 -- common/autotest_common.sh@1142 -- # return 0 00:05:57.525 13:31:46 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:57.525 13:31:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:57.525 13:31:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:57.525 13:31:46 -- common/autotest_common.sh@10 -- # set +x 00:05:57.784 ************************************ 00:05:57.784 START TEST skip_rpc 00:05:57.784 ************************************ 00:05:57.784 13:31:46 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:57.784 * Looking for test storage... 00:05:57.784 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:57.784 13:31:46 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:57.784 13:31:46 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:57.784 13:31:46 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:57.784 13:31:46 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:57.784 13:31:46 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:57.784 13:31:46 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:57.784 ************************************ 00:05:57.784 START TEST skip_rpc 00:05:57.784 ************************************ 00:05:57.784 13:31:46 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:05:57.784 13:31:46 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=386195 00:05:57.784 13:31:46 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:57.784 13:31:46 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:57.784 13:31:46 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:58.063 [2024-07-12 13:31:46.371839] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:05:58.063 [2024-07-12 13:31:46.371924] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid386195 ] 00:05:58.063 [2024-07-12 13:31:46.515280] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:58.063 [2024-07-12 13:31:46.612574] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:06:03.335 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 386195 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 386195 ']' 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 386195 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 386195 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 386195' 00:06:03.336 killing process with pid 386195 00:06:03.336 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 386195 00:06:03.337 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 386195 00:06:03.337 00:06:03.337 real 0m5.425s 00:06:03.337 user 0m5.077s 00:06:03.337 sys 0m0.366s 00:06:03.337 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:03.337 13:31:51 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.337 ************************************ 00:06:03.337 END TEST skip_rpc 00:06:03.337 ************************************ 
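The skip_rpc run that just finished boils down to one check: start spdk_tgt with --no-rpc-server and confirm that an RPC call cannot succeed. A minimal standalone sketch of that flow outside the autotest harness follows; the ./spdk prefix, the tgt_pid variable, and the plain sleep in place of the harness's wait logic are assumptions for illustration, and rpc_cmd in the log roughly corresponds to scripts/rpc.py.

  # sketch only: the essence of the skip_rpc check logged above
  ./spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 &
  tgt_pid=$!
  sleep 5
  # with no RPC server listening, spdk_get_version is expected to fail
  if ./spdk/scripts/rpc.py spdk_get_version; then
      echo "unexpected: RPC succeeded without an RPC server"
      kill $tgt_pid
      exit 1
  fi
  kill $tgt_pid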
00:06:03.337 13:31:51 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:03.337 13:31:51 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:03.337 13:31:51 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:03.337 13:31:51 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:03.337 13:31:51 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.337 ************************************ 00:06:03.338 START TEST skip_rpc_with_json 00:06:03.338 ************************************ 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=386981 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 386981 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 386981 ']' 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:03.338 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:03.338 13:31:51 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:03.338 [2024-07-12 13:31:51.880139] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:06:03.338 [2024-07-12 13:31:51.880229] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid386981 ] 00:06:03.599 [2024-07-12 13:31:52.023949] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.599 [2024-07-12 13:31:52.129755] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:04.534 [2024-07-12 13:31:52.801036] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:04.534 request: 00:06:04.534 { 00:06:04.534 "trtype": "tcp", 00:06:04.534 "method": "nvmf_get_transports", 00:06:04.534 "req_id": 1 00:06:04.534 } 00:06:04.534 Got JSON-RPC error response 00:06:04.534 response: 00:06:04.534 { 00:06:04.534 "code": -19, 00:06:04.534 "message": "No such device" 00:06:04.534 } 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:04.534 [2024-07-12 13:31:52.813175] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:04.534 13:31:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:04.534 { 00:06:04.534 "subsystems": [ 00:06:04.534 { 00:06:04.534 "subsystem": "keyring", 00:06:04.534 "config": [] 00:06:04.534 }, 00:06:04.534 { 00:06:04.534 "subsystem": "iobuf", 00:06:04.534 "config": [ 00:06:04.534 { 00:06:04.534 "method": "iobuf_set_options", 00:06:04.534 "params": { 00:06:04.534 "small_pool_count": 8192, 00:06:04.534 "large_pool_count": 1024, 00:06:04.534 "small_bufsize": 8192, 00:06:04.534 "large_bufsize": 135168 00:06:04.535 } 00:06:04.535 } 00:06:04.535 ] 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "sock", 00:06:04.535 "config": [ 00:06:04.535 { 00:06:04.535 "method": "sock_set_default_impl", 00:06:04.535 "params": { 00:06:04.535 "impl_name": "posix" 00:06:04.535 } 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "method": "sock_impl_set_options", 00:06:04.535 "params": { 00:06:04.535 "impl_name": "ssl", 00:06:04.535 "recv_buf_size": 4096, 00:06:04.535 "send_buf_size": 4096, 
00:06:04.535 "enable_recv_pipe": true, 00:06:04.535 "enable_quickack": false, 00:06:04.535 "enable_placement_id": 0, 00:06:04.535 "enable_zerocopy_send_server": true, 00:06:04.535 "enable_zerocopy_send_client": false, 00:06:04.535 "zerocopy_threshold": 0, 00:06:04.535 "tls_version": 0, 00:06:04.535 "enable_ktls": false 00:06:04.535 } 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "method": "sock_impl_set_options", 00:06:04.535 "params": { 00:06:04.535 "impl_name": "posix", 00:06:04.535 "recv_buf_size": 2097152, 00:06:04.535 "send_buf_size": 2097152, 00:06:04.535 "enable_recv_pipe": true, 00:06:04.535 "enable_quickack": false, 00:06:04.535 "enable_placement_id": 0, 00:06:04.535 "enable_zerocopy_send_server": true, 00:06:04.535 "enable_zerocopy_send_client": false, 00:06:04.535 "zerocopy_threshold": 0, 00:06:04.535 "tls_version": 0, 00:06:04.535 "enable_ktls": false 00:06:04.535 } 00:06:04.535 } 00:06:04.535 ] 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "vmd", 00:06:04.535 "config": [] 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "accel", 00:06:04.535 "config": [ 00:06:04.535 { 00:06:04.535 "method": "accel_set_options", 00:06:04.535 "params": { 00:06:04.535 "small_cache_size": 128, 00:06:04.535 "large_cache_size": 16, 00:06:04.535 "task_count": 2048, 00:06:04.535 "sequence_count": 2048, 00:06:04.535 "buf_count": 2048 00:06:04.535 } 00:06:04.535 } 00:06:04.535 ] 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "bdev", 00:06:04.535 "config": [ 00:06:04.535 { 00:06:04.535 "method": "bdev_set_options", 00:06:04.535 "params": { 00:06:04.535 "bdev_io_pool_size": 65535, 00:06:04.535 "bdev_io_cache_size": 256, 00:06:04.535 "bdev_auto_examine": true, 00:06:04.535 "iobuf_small_cache_size": 128, 00:06:04.535 "iobuf_large_cache_size": 16 00:06:04.535 } 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "method": "bdev_raid_set_options", 00:06:04.535 "params": { 00:06:04.535 "process_window_size_kb": 1024 00:06:04.535 } 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "method": "bdev_iscsi_set_options", 00:06:04.535 "params": { 00:06:04.535 "timeout_sec": 30 00:06:04.535 } 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "method": "bdev_nvme_set_options", 00:06:04.535 "params": { 00:06:04.535 "action_on_timeout": "none", 00:06:04.535 "timeout_us": 0, 00:06:04.535 "timeout_admin_us": 0, 00:06:04.535 "keep_alive_timeout_ms": 10000, 00:06:04.535 "arbitration_burst": 0, 00:06:04.535 "low_priority_weight": 0, 00:06:04.535 "medium_priority_weight": 0, 00:06:04.535 "high_priority_weight": 0, 00:06:04.535 "nvme_adminq_poll_period_us": 10000, 00:06:04.535 "nvme_ioq_poll_period_us": 0, 00:06:04.535 "io_queue_requests": 0, 00:06:04.535 "delay_cmd_submit": true, 00:06:04.535 "transport_retry_count": 4, 00:06:04.535 "bdev_retry_count": 3, 00:06:04.535 "transport_ack_timeout": 0, 00:06:04.535 "ctrlr_loss_timeout_sec": 0, 00:06:04.535 "reconnect_delay_sec": 0, 00:06:04.535 "fast_io_fail_timeout_sec": 0, 00:06:04.535 "disable_auto_failback": false, 00:06:04.535 "generate_uuids": false, 00:06:04.535 "transport_tos": 0, 00:06:04.535 "nvme_error_stat": false, 00:06:04.535 "rdma_srq_size": 0, 00:06:04.535 "io_path_stat": false, 00:06:04.535 "allow_accel_sequence": false, 00:06:04.535 "rdma_max_cq_size": 0, 00:06:04.535 "rdma_cm_event_timeout_ms": 0, 00:06:04.535 "dhchap_digests": [ 00:06:04.535 "sha256", 00:06:04.535 "sha384", 00:06:04.535 "sha512" 00:06:04.535 ], 00:06:04.535 "dhchap_dhgroups": [ 00:06:04.535 "null", 00:06:04.535 "ffdhe2048", 00:06:04.535 "ffdhe3072", 00:06:04.535 "ffdhe4096", 00:06:04.535 
"ffdhe6144", 00:06:04.535 "ffdhe8192" 00:06:04.535 ] 00:06:04.535 } 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "method": "bdev_nvme_set_hotplug", 00:06:04.535 "params": { 00:06:04.535 "period_us": 100000, 00:06:04.535 "enable": false 00:06:04.535 } 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "method": "bdev_wait_for_examine" 00:06:04.535 } 00:06:04.535 ] 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "scsi", 00:06:04.535 "config": null 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "scheduler", 00:06:04.535 "config": [ 00:06:04.535 { 00:06:04.535 "method": "framework_set_scheduler", 00:06:04.535 "params": { 00:06:04.535 "name": "static" 00:06:04.535 } 00:06:04.535 } 00:06:04.535 ] 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "vhost_scsi", 00:06:04.535 "config": [] 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "vhost_blk", 00:06:04.535 "config": [] 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "ublk", 00:06:04.535 "config": [] 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "nbd", 00:06:04.535 "config": [] 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "nvmf", 00:06:04.535 "config": [ 00:06:04.535 { 00:06:04.535 "method": "nvmf_set_config", 00:06:04.535 "params": { 00:06:04.535 "discovery_filter": "match_any", 00:06:04.535 "admin_cmd_passthru": { 00:06:04.535 "identify_ctrlr": false 00:06:04.535 } 00:06:04.535 } 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "method": "nvmf_set_max_subsystems", 00:06:04.535 "params": { 00:06:04.535 "max_subsystems": 1024 00:06:04.535 } 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "method": "nvmf_set_crdt", 00:06:04.535 "params": { 00:06:04.535 "crdt1": 0, 00:06:04.535 "crdt2": 0, 00:06:04.535 "crdt3": 0 00:06:04.535 } 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "method": "nvmf_create_transport", 00:06:04.535 "params": { 00:06:04.535 "trtype": "TCP", 00:06:04.535 "max_queue_depth": 128, 00:06:04.535 "max_io_qpairs_per_ctrlr": 127, 00:06:04.535 "in_capsule_data_size": 4096, 00:06:04.535 "max_io_size": 131072, 00:06:04.535 "io_unit_size": 131072, 00:06:04.535 "max_aq_depth": 128, 00:06:04.535 "num_shared_buffers": 511, 00:06:04.535 "buf_cache_size": 4294967295, 00:06:04.535 "dif_insert_or_strip": false, 00:06:04.535 "zcopy": false, 00:06:04.535 "c2h_success": true, 00:06:04.535 "sock_priority": 0, 00:06:04.535 "abort_timeout_sec": 1, 00:06:04.535 "ack_timeout": 0, 00:06:04.535 "data_wr_pool_size": 0 00:06:04.535 } 00:06:04.535 } 00:06:04.535 ] 00:06:04.535 }, 00:06:04.535 { 00:06:04.535 "subsystem": "iscsi", 00:06:04.535 "config": [ 00:06:04.535 { 00:06:04.535 "method": "iscsi_set_options", 00:06:04.535 "params": { 00:06:04.535 "node_base": "iqn.2016-06.io.spdk", 00:06:04.535 "max_sessions": 128, 00:06:04.535 "max_connections_per_session": 2, 00:06:04.535 "max_queue_depth": 64, 00:06:04.535 "default_time2wait": 2, 00:06:04.535 "default_time2retain": 20, 00:06:04.535 "first_burst_length": 8192, 00:06:04.535 "immediate_data": true, 00:06:04.535 "allow_duplicated_isid": false, 00:06:04.535 "error_recovery_level": 0, 00:06:04.535 "nop_timeout": 60, 00:06:04.535 "nop_in_interval": 30, 00:06:04.535 "disable_chap": false, 00:06:04.535 "require_chap": false, 00:06:04.535 "mutual_chap": false, 00:06:04.535 "chap_group": 0, 00:06:04.535 "max_large_datain_per_connection": 64, 00:06:04.535 "max_r2t_per_connection": 4, 00:06:04.535 "pdu_pool_size": 36864, 00:06:04.535 "immediate_data_pool_size": 16384, 00:06:04.535 "data_out_pool_size": 2048 00:06:04.535 } 00:06:04.535 } 00:06:04.535 ] 00:06:04.535 } 
00:06:04.535 ] 00:06:04.535 } 00:06:04.535 13:31:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:04.535 13:31:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 386981 00:06:04.535 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 386981 ']' 00:06:04.535 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 386981 00:06:04.535 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:04.535 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:04.535 13:31:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 386981 00:06:04.535 13:31:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:04.535 13:31:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:04.535 13:31:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 386981' 00:06:04.535 killing process with pid 386981 00:06:04.535 13:31:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 386981 00:06:04.535 13:31:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 386981 00:06:05.101 13:31:53 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=387205 00:06:05.101 13:31:53 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:05.101 13:31:53 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 387205 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 387205 ']' 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 387205 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 387205 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 387205' 00:06:10.370 killing process with pid 387205 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 387205 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 387205 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:10.370 00:06:10.370 real 0m7.058s 00:06:10.370 user 0m6.756s 00:06:10.370 sys 0m0.866s 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:10.370 13:31:58 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:10.370 ************************************ 00:06:10.370 END TEST skip_rpc_with_json 00:06:10.370 ************************************ 00:06:10.370 13:31:58 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:10.370 13:31:58 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:10.370 13:31:58 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:10.370 13:31:58 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:10.370 13:31:58 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:10.370 ************************************ 00:06:10.370 START TEST skip_rpc_with_delay 00:06:10.370 ************************************ 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:10.370 13:31:58 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:10.629 [2024-07-12 13:31:59.012472] app.c: 831:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
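The *ERROR* above is the expected result rather than a failure of the run: skip_rpc_with_delay starts spdk_tgt with both --no-rpc-server and --wait-for-rpc, an invalid combination, and the test passes only if startup is rejected. A hedged sketch of that negative check (the ./spdk prefix is an assumption; the flags are taken from the log):

  # sketch only: the invalid flag pairing exercised by skip_rpc_with_delay
  if ./spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc; then
      echo "unexpected: spdk_tgt started despite --wait-for-rpc with no RPC server"
      exit 1
  fi
  echo "startup rejected as expected"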
00:06:10.629 [2024-07-12 13:31:59.012565] app.c: 710:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:10.629 13:31:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:06:10.629 13:31:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:10.629 13:31:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:10.629 13:31:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:10.629 00:06:10.629 real 0m0.093s 00:06:10.629 user 0m0.051s 00:06:10.629 sys 0m0.042s 00:06:10.629 13:31:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:10.629 13:31:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:10.629 ************************************ 00:06:10.629 END TEST skip_rpc_with_delay 00:06:10.629 ************************************ 00:06:10.629 13:31:59 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:10.629 13:31:59 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:10.629 13:31:59 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:10.629 13:31:59 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:10.629 13:31:59 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:10.629 13:31:59 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:10.629 13:31:59 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:10.629 ************************************ 00:06:10.629 START TEST exit_on_failed_rpc_init 00:06:10.629 ************************************ 00:06:10.629 13:31:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:06:10.630 13:31:59 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=387960 00:06:10.630 13:31:59 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 387960 00:06:10.630 13:31:59 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:10.630 13:31:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 387960 ']' 00:06:10.630 13:31:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:10.630 13:31:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:10.630 13:31:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:10.630 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:10.630 13:31:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:10.630 13:31:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:10.630 [2024-07-12 13:31:59.190638] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:06:10.630 [2024-07-12 13:31:59.190709] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid387960 ] 00:06:10.889 [2024-07-12 13:31:59.318573] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.889 [2024-07-12 13:31:59.425256] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:11.825 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:11.825 [2024-07-12 13:32:00.195307] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:06:11.825 [2024-07-12 13:32:00.195382] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid388139 ] 00:06:11.825 [2024-07-12 13:32:00.332382] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.084 [2024-07-12 13:32:00.443785] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:12.084 [2024-07-12 13:32:00.443888] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
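The socket-in-use *ERROR* above is the condition exit_on_failed_rpc_init provokes: one spdk_tgt already listens on /var/tmp/spdk.sock, so a second instance launched without a different -r socket path must fail RPC initialization and exit non-zero. A rough reproduction, with the ./spdk prefix, the variable name first, and a plain sleep standing in for the harness's waitforlisten as illustrative assumptions:

  # sketch only: two targets contending for the default RPC socket
  ./spdk/build/bin/spdk_tgt -m 0x1 &          # first instance claims /var/tmp/spdk.sock
  first=$!
  sleep 5
  ./spdk/build/bin/spdk_tgt -m 0x2            # second instance: RPC listen fails, socket in use
  echo "second target exited with status $?"  # expected to be non-zero
  kill $first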
00:06:12.084 [2024-07-12 13:32:00.443910] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:12.084 [2024-07-12 13:32:00.443931] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 387960 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 387960 ']' 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 387960 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 387960 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 387960' 00:06:12.084 killing process with pid 387960 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 387960 00:06:12.084 13:32:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 387960 00:06:12.653 00:06:12.653 real 0m1.887s 00:06:12.653 user 0m2.181s 00:06:12.653 sys 0m0.653s 00:06:12.653 13:32:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:12.653 13:32:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:12.653 ************************************ 00:06:12.653 END TEST exit_on_failed_rpc_init 00:06:12.653 ************************************ 00:06:12.653 13:32:01 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:12.653 13:32:01 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:12.653 00:06:12.653 real 0m14.906s 00:06:12.653 user 0m14.228s 00:06:12.653 sys 0m2.241s 00:06:12.653 13:32:01 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:12.653 13:32:01 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:12.653 ************************************ 00:06:12.653 END TEST skip_rpc 00:06:12.653 ************************************ 00:06:12.653 13:32:01 -- common/autotest_common.sh@1142 -- # return 0 00:06:12.653 13:32:01 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:12.653 13:32:01 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:12.653 13:32:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:12.653 13:32:01 -- common/autotest_common.sh@10 -- # set +x 00:06:12.653 ************************************ 00:06:12.653 START TEST rpc_client 00:06:12.653 ************************************ 00:06:12.653 13:32:01 rpc_client -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:12.653 * Looking for test storage... 00:06:12.912 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:12.912 13:32:01 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:12.912 OK 00:06:12.912 13:32:01 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:12.912 00:06:12.912 real 0m0.139s 00:06:12.912 user 0m0.057s 00:06:12.912 sys 0m0.092s 00:06:12.912 13:32:01 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:12.912 13:32:01 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:12.912 ************************************ 00:06:12.912 END TEST rpc_client 00:06:12.912 ************************************ 00:06:12.912 13:32:01 -- common/autotest_common.sh@1142 -- # return 0 00:06:12.912 13:32:01 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:12.912 13:32:01 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:12.912 13:32:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:12.912 13:32:01 -- common/autotest_common.sh@10 -- # set +x 00:06:12.912 ************************************ 00:06:12.912 START TEST json_config 00:06:12.912 ************************************ 00:06:12.912 13:32:01 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:12.912 13:32:01 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:12.912 13:32:01 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:12.912 13:32:01 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:12.912 13:32:01 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:12.912 13:32:01 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:12.912 13:32:01 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:12.912 13:32:01 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:12.912 13:32:01 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:12.912 13:32:01 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:12.912 13:32:01 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:12.913 13:32:01 json_config -- 
nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:12.913 13:32:01 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:12.913 13:32:01 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:12.913 13:32:01 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:12.913 13:32:01 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.913 13:32:01 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.913 13:32:01 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.913 13:32:01 json_config -- paths/export.sh@5 -- # export PATH 00:06:12.913 13:32:01 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@47 -- # : 0 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:12.913 13:32:01 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:12.913 
13:32:01 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:12.913 INFO: JSON configuration test init 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:12.913 13:32:01 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:12.913 13:32:01 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:12.913 13:32:01 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:12.913 13:32:01 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:12.913 13:32:01 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:12.913 13:32:01 json_config -- json_config/common.sh@9 -- # local app=target 00:06:12.913 13:32:01 json_config -- json_config/common.sh@10 -- # shift 00:06:12.913 13:32:01 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:12.913 13:32:01 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:12.913 13:32:01 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:12.913 13:32:01 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:12.913 13:32:01 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:12.913 13:32:01 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=388431 00:06:12.913 13:32:01 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:12.913 Waiting for target to run... 
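Unlike the earlier rpc tests, the json_config target is started with -r /var/tmp/spdk_tgt.sock, so every tgt_rpc call that follows passes that socket to rpc.py with -s instead of using the default /var/tmp/spdk.sock. A small sketch of that pattern, using commands that appear later in this run (the ./spdk prefix is an assumption):

  # sketch only: driving a target that listens on a non-default RPC socket
  ./spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types
  ./spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0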
00:06:12.913 13:32:01 json_config -- json_config/common.sh@25 -- # waitforlisten 388431 /var/tmp/spdk_tgt.sock 00:06:12.913 13:32:01 json_config -- common/autotest_common.sh@829 -- # '[' -z 388431 ']' 00:06:12.913 13:32:01 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:12.913 13:32:01 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:12.913 13:32:01 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:12.913 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:12.913 13:32:01 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:12.913 13:32:01 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:12.913 13:32:01 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:13.172 [2024-07-12 13:32:01.526785] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:06:13.172 [2024-07-12 13:32:01.526867] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid388431 ] 00:06:13.739 [2024-07-12 13:32:02.127070] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.739 [2024-07-12 13:32:02.238190] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.998 13:32:02 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:13.998 13:32:02 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:13.998 13:32:02 json_config -- json_config/common.sh@26 -- # echo '' 00:06:13.998 00:06:13.998 13:32:02 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:13.998 13:32:02 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:13.998 13:32:02 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:13.998 13:32:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:13.998 13:32:02 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:13.998 13:32:02 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:13.998 13:32:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:14.256 13:32:02 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:14.256 13:32:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:14.257 [2024-07-12 13:32:02.743912] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:14.257 13:32:02 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:14.257 13:32:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:14.515 [2024-07-12 13:32:02.916347] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:14.515 13:32:02 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:14.515 13:32:02 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:14.515 13:32:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:14.515 13:32:02 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:14.515 13:32:02 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:14.515 13:32:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:14.773 [2024-07-12 13:32:03.233845] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:17.304 13:32:05 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:17.304 13:32:05 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:17.304 13:32:05 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:17.304 13:32:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:17.304 13:32:05 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:17.304 13:32:05 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:17.304 13:32:05 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:06:17.304 13:32:05 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:17.304 13:32:05 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:17.304 13:32:05 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:17.580 13:32:06 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:17.580 13:32:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:17.580 13:32:06 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:17.580 13:32:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@111 -- # 
get_notifications 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:17.580 13:32:06 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:17.580 13:32:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:17.840 13:32:06 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:17.840 13:32:06 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:17.840 13:32:06 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:17.840 13:32:06 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:17.840 13:32:06 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:17.840 13:32:06 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:17.840 13:32:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:18.408 Nvme0n1p0 Nvme0n1p1 00:06:18.408 13:32:06 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:18.408 13:32:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:18.667 [2024-07-12 13:32:07.123600] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:18.667 [2024-07-12 13:32:07.123652] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:18.667 00:06:18.667 13:32:07 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:18.667 13:32:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:18.925 Malloc3 00:06:18.925 13:32:07 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:18.925 13:32:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:19.184 [2024-07-12 13:32:07.621039] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:19.184 [2024-07-12 13:32:07.621092] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:19.184 [2024-07-12 13:32:07.621116] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2120340 00:06:19.184 [2024-07-12 13:32:07.621128] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:19.184 [2024-07-12 13:32:07.622773] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:19.184 [2024-07-12 13:32:07.622803] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:19.184 PTBdevFromMalloc3 00:06:19.184 13:32:07 json_config 
-- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:19.184 13:32:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:19.443 Null0 00:06:19.443 13:32:07 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:19.443 13:32:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:19.702 Malloc0 00:06:19.702 13:32:08 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:19.702 13:32:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:19.960 Malloc1 00:06:19.960 13:32:08 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:19.960 13:32:08 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:20.219 102400+0 records in 00:06:20.219 102400+0 records out 00:06:20.219 104857600 bytes (105 MB, 100 MiB) copied, 0.302918 s, 346 MB/s 00:06:20.219 13:32:08 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:20.219 13:32:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:20.479 aio_disk 00:06:20.479 13:32:08 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:20.479 13:32:08 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:20.479 13:32:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:25.746 e7f1c5da-0711-4049-8782-d154e3421dfa 00:06:25.746 13:32:13 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:25.746 13:32:13 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:25.746 13:32:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:25.747 13:32:13 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:25.747 13:32:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:25.747 13:32:14 json_config -- json_config/json_config.sh@154 -- # tgt_rpc 
bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:25.747 13:32:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:26.005 13:32:14 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:26.005 13:32:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:26.005 13:32:14 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:06:26.005 13:32:14 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:26.005 13:32:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:26.572 MallocForCryptoBdev 00:06:26.572 13:32:15 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:06:26.572 13:32:15 json_config -- json_config/json_config.sh@159 -- # wc -l 00:06:26.829 13:32:15 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:06:26.829 13:32:15 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:06:26.829 13:32:15 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:26.829 13:32:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:26.829 [2024-07-12 13:32:15.392987] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:26.829 CryptoMallocBdev 00:06:26.829 13:32:15 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:0fbb4f13-c435-435c-b22c-293b8ff4c5b7 bdev_register:7431b960-b7f0-4224-885e-4f539adeff0d bdev_register:674fd5ed-5d00-4f8e-8a7c-fd2724de7775 bdev_register:c3e1cb25-d47a-4224-81e8-a303bb52bef4 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 
bdev_register:Malloc1 bdev_register:aio_disk bdev_register:0fbb4f13-c435-435c-b22c-293b8ff4c5b7 bdev_register:7431b960-b7f0-4224-885e-4f539adeff0d bdev_register:674fd5ed-5d00-4f8e-8a7c-fd2724de7775 bdev_register:c3e1cb25-d47a-4224-81e8-a303bb52bef4 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@71 -- # sort 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@72 -- # sort 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:27.087 13:32:15 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:27.087 13:32:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 
json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:0fbb4f13-c435-435c-b22c-293b8ff4c5b7 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:7431b960-b7f0-4224-885e-4f539adeff0d 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:674fd5ed-5d00-4f8e-8a7c-fd2724de7775 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:c3e1cb25-d47a-4224-81e8-a303bb52bef4 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:0fbb4f13-c435-435c-b22c-293b8ff4c5b7 bdev_register:674fd5ed-5d00-4f8e-8a7c-fd2724de7775 bdev_register:7431b960-b7f0-4224-885e-4f539adeff0d bdev_register:aio_disk bdev_register:c3e1cb25-d47a-4224-81e8-a303bb52bef4 bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev 
bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\f\b\b\4\f\1\3\-\c\4\3\5\-\4\3\5\c\-\b\2\2\c\-\2\9\3\b\8\f\f\4\c\5\b\7\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\6\7\4\f\d\5\e\d\-\5\d\0\0\-\4\f\8\e\-\8\a\7\c\-\f\d\2\7\2\4\d\e\7\7\7\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\7\4\3\1\b\9\6\0\-\b\7\f\0\-\4\2\2\4\-\8\8\5\e\-\4\f\5\3\9\a\d\e\f\f\0\d\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\c\3\e\1\c\b\2\5\-\d\4\7\a\-\4\2\2\4\-\8\1\e\8\-\a\3\0\3\b\b\5\2\b\e\f\4\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@86 -- # cat 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:0fbb4f13-c435-435c-b22c-293b8ff4c5b7 bdev_register:674fd5ed-5d00-4f8e-8a7c-fd2724de7775 bdev_register:7431b960-b7f0-4224-885e-4f539adeff0d bdev_register:aio_disk bdev_register:c3e1cb25-d47a-4224-81e8-a303bb52bef4 bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:27.347 Expected events matched: 00:06:27.347 bdev_register:0fbb4f13-c435-435c-b22c-293b8ff4c5b7 00:06:27.347 bdev_register:674fd5ed-5d00-4f8e-8a7c-fd2724de7775 00:06:27.347 bdev_register:7431b960-b7f0-4224-885e-4f539adeff0d 00:06:27.347 bdev_register:aio_disk 00:06:27.347 bdev_register:c3e1cb25-d47a-4224-81e8-a303bb52bef4 00:06:27.347 bdev_register:CryptoMallocBdev 00:06:27.347 bdev_register:Malloc0 00:06:27.347 bdev_register:Malloc0p0 00:06:27.347 bdev_register:Malloc0p1 00:06:27.347 bdev_register:Malloc0p2 00:06:27.347 bdev_register:Malloc1 00:06:27.347 bdev_register:Malloc3 00:06:27.347 bdev_register:MallocForCryptoBdev 00:06:27.347 bdev_register:Null0 00:06:27.347 bdev_register:Nvme0n1 00:06:27.347 bdev_register:Nvme0n1p0 00:06:27.347 bdev_register:Nvme0n1p1 00:06:27.347 bdev_register:PTBdevFromMalloc3 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:06:27.347 13:32:15 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:27.347 13:32:15 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:06:27.347 13:32:15 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:27.347 13:32:15 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:06:27.347 13:32:15 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:27.347 13:32:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:27.605 MallocBdevForConfigChangeCheck 00:06:27.605 13:32:16 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:06:27.605 13:32:16 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:27.605 13:32:16 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:27.605 13:32:16 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:06:27.605 13:32:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:28.173 13:32:16 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:28.173 INFO: shutting down applications... 00:06:28.173 13:32:16 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:28.173 13:32:16 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:28.173 13:32:16 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:28.173 13:32:16 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:28.173 [2024-07-12 13:32:16.657065] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:31.456 Calling clear_iscsi_subsystem 00:06:31.456 Calling clear_nvmf_subsystem 00:06:31.456 Calling clear_nbd_subsystem 00:06:31.456 Calling clear_ublk_subsystem 00:06:31.456 Calling clear_vhost_blk_subsystem 00:06:31.456 Calling clear_vhost_scsi_subsystem 00:06:31.456 Calling clear_bdev_subsystem 00:06:31.456 13:32:19 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:31.456 13:32:19 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:31.456 13:32:19 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:31.456 13:32:19 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:31.456 13:32:19 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:31.456 13:32:19 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:31.456 13:32:19 json_config -- json_config/json_config.sh@345 -- # break 00:06:31.456 13:32:19 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:31.456 13:32:19 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:31.456 13:32:19 json_config -- json_config/common.sh@31 -- # local app=target 00:06:31.456 13:32:19 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:31.456 13:32:19 json_config -- json_config/common.sh@35 -- # [[ -n 
388431 ]] 00:06:31.456 13:32:19 json_config -- json_config/common.sh@38 -- # kill -SIGINT 388431 00:06:31.456 13:32:20 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:31.456 13:32:20 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:31.456 13:32:20 json_config -- json_config/common.sh@41 -- # kill -0 388431 00:06:31.456 13:32:20 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:32.025 13:32:20 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:32.025 13:32:20 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:32.025 13:32:20 json_config -- json_config/common.sh@41 -- # kill -0 388431 00:06:32.025 13:32:20 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:32.025 13:32:20 json_config -- json_config/common.sh@43 -- # break 00:06:32.025 13:32:20 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:32.025 13:32:20 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:32.025 SPDK target shutdown done 00:06:32.025 13:32:20 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:32.025 INFO: relaunching applications... 00:06:32.025 13:32:20 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:32.025 13:32:20 json_config -- json_config/common.sh@9 -- # local app=target 00:06:32.025 13:32:20 json_config -- json_config/common.sh@10 -- # shift 00:06:32.025 13:32:20 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:32.025 13:32:20 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:32.025 13:32:20 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:32.025 13:32:20 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:32.025 13:32:20 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:32.025 13:32:20 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=391041 00:06:32.025 13:32:20 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:32.025 Waiting for target to run... 00:06:32.025 13:32:20 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:32.025 13:32:20 json_config -- json_config/common.sh@25 -- # waitforlisten 391041 /var/tmp/spdk_tgt.sock 00:06:32.025 13:32:20 json_config -- common/autotest_common.sh@829 -- # '[' -z 391041 ']' 00:06:32.025 13:32:20 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:32.025 13:32:20 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:32.026 13:32:20 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:32.026 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:32.026 13:32:20 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:32.026 13:32:20 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:32.026 [2024-07-12 13:32:20.583660] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
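The shutdown/relaunch cycle logged above is the heart of json_config.sh: the live configuration is persisted with save_config, the target receives SIGINT and its PID is polled with kill -0 (up to 30 tries, 0.5 s apart), and a fresh spdk_tgt is then started from the saved JSON file. A minimal sketch of that cycle, assuming a target is already listening on /var/tmp/spdk_tgt.sock and that SPDK_DIR and TGT_PID are placeholder variables, not names from the test scripts:

#!/usr/bin/env bash
# Sketch only: persist the running config, stop the target, relaunch it from the file.
RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"
$RPC save_config > /tmp/spdk_tgt_config.json        # dump the current JSON configuration

kill -SIGINT "$TGT_PID"                             # ask the target to shut down cleanly
for _ in $(seq 1 30); do                            # same 30 x 0.5 s poll as json_config/common.sh
    kill -0 "$TGT_PID" 2>/dev/null || break         # signal 0 only checks the PID still exists
    sleep 0.5
done

# Relaunch from the saved configuration; flags mirror the log (1 core, 1024 MB, same socket).
"$SPDK_DIR/build/bin/spdk_tgt" -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
    --json /tmp/spdk_tgt_config.json &
TGT_PID=$!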
00:06:32.026 [2024-07-12 13:32:20.583741] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid391041 ] 00:06:32.961 [2024-07-12 13:32:21.233102] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.961 [2024-07-12 13:32:21.340263] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.961 [2024-07-12 13:32:21.394394] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:32.961 [2024-07-12 13:32:21.402430] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:32.961 [2024-07-12 13:32:21.410448] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:32.961 [2024-07-12 13:32:21.491859] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:35.493 [2024-07-12 13:32:23.700943] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:35.493 [2024-07-12 13:32:23.701004] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:35.493 [2024-07-12 13:32:23.701020] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:35.493 [2024-07-12 13:32:23.708956] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:35.493 [2024-07-12 13:32:23.708984] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:35.493 [2024-07-12 13:32:23.716967] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:35.493 [2024-07-12 13:32:23.716993] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:35.493 [2024-07-12 13:32:23.725002] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:35.493 [2024-07-12 13:32:23.725032] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:35.493 [2024-07-12 13:32:23.725046] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:35.753 [2024-07-12 13:32:24.096668] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:35.753 [2024-07-12 13:32:24.096715] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:35.753 [2024-07-12 13:32:24.096732] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ee24d0 00:06:35.753 [2024-07-12 13:32:24.096744] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:35.753 [2024-07-12 13:32:24.097047] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:35.753 [2024-07-12 13:32:24.097066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:35.753 13:32:24 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:35.753 13:32:24 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:35.753 13:32:24 json_config -- json_config/common.sh@26 -- # echo '' 00:06:35.753 00:06:35.753 13:32:24 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:06:35.753 13:32:24 json_config -- 
json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 00:06:35.753 INFO: Checking if target configuration is the same... 00:06:35.753 13:32:24 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:35.753 13:32:24 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:06:35.753 13:32:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:35.753 + '[' 2 -ne 2 ']' 00:06:35.753 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:35.753 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:06:35.753 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:35.753 +++ basename /dev/fd/62 00:06:35.753 ++ mktemp /tmp/62.XXX 00:06:35.753 + tmp_file_1=/tmp/62.uxc 00:06:35.753 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:35.753 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:35.753 + tmp_file_2=/tmp/spdk_tgt_config.json.CN4 00:06:35.753 + ret=0 00:06:35.753 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:36.322 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:36.322 + diff -u /tmp/62.uxc /tmp/spdk_tgt_config.json.CN4 00:06:36.322 + echo 'INFO: JSON config files are the same' 00:06:36.322 INFO: JSON config files are the same 00:06:36.322 + rm /tmp/62.uxc /tmp/spdk_tgt_config.json.CN4 00:06:36.322 + exit 0 00:06:36.322 13:32:24 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:06:36.322 13:32:24 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:06:36.322 INFO: changing configuration and checking if this can be detected... 00:06:36.322 13:32:24 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:36.322 13:32:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:36.322 13:32:24 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:36.322 13:32:24 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:06:36.322 13:32:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:36.580 + '[' 2 -ne 2 ']' 00:06:36.580 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:36.580 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
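Both configuration checks above go through json_diff.sh, which reduces to: dump the live configuration with save_config, normalize both JSON documents with config_filter.py -method sort, and compare them with diff -u; an empty diff prints "JSON config files are the same", a non-empty one reports a configuration change. A rough equivalent, assuming the same rpc.py and config_filter.py paths that appear in the log:

# Sketch: compare the running target's configuration against a saved JSON file.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"
FILTER="$SPDK/test/json_config/config_filter.py"

live=$(mktemp) saved=$(mktemp)
$RPC save_config | $FILTER -method sort > "$live"                 # normalized live config
$FILTER -method sort < "$SPDK/spdk_tgt_config.json" > "$saved"    # normalized on-disk config

if diff -u "$live" "$saved"; then
    echo 'INFO: JSON config files are the same'
else
    echo 'INFO: configuration change detected.'
fi
rm -f "$live" "$saved"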
00:06:36.580 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:36.580 +++ basename /dev/fd/62 00:06:36.580 ++ mktemp /tmp/62.XXX 00:06:36.580 + tmp_file_1=/tmp/62.cyz 00:06:36.580 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:36.580 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:36.580 + tmp_file_2=/tmp/spdk_tgt_config.json.Jb7 00:06:36.580 + ret=0 00:06:36.580 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:36.839 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:36.839 + diff -u /tmp/62.cyz /tmp/spdk_tgt_config.json.Jb7 00:06:36.839 + ret=1 00:06:36.839 + echo '=== Start of file: /tmp/62.cyz ===' 00:06:36.839 + cat /tmp/62.cyz 00:06:36.839 + echo '=== End of file: /tmp/62.cyz ===' 00:06:36.839 + echo '' 00:06:36.839 + echo '=== Start of file: /tmp/spdk_tgt_config.json.Jb7 ===' 00:06:36.839 + cat /tmp/spdk_tgt_config.json.Jb7 00:06:36.839 + echo '=== End of file: /tmp/spdk_tgt_config.json.Jb7 ===' 00:06:36.839 + echo '' 00:06:36.839 + rm /tmp/62.cyz /tmp/spdk_tgt_config.json.Jb7 00:06:36.839 + exit 1 00:06:36.839 13:32:25 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:06:36.839 INFO: configuration change detected. 00:06:36.839 13:32:25 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:06:36.839 13:32:25 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:06:36.839 13:32:25 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:36.839 13:32:25 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:36.839 13:32:25 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:06:36.839 13:32:25 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:06:36.839 13:32:25 json_config -- json_config/json_config.sh@317 -- # [[ -n 391041 ]] 00:06:36.839 13:32:25 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:06:36.839 13:32:25 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:06:36.839 13:32:25 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:36.839 13:32:25 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:36.839 13:32:25 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:06:36.839 13:32:25 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:06:36.839 13:32:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:06:37.097 13:32:25 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:06:37.097 13:32:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:06:37.356 13:32:25 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:06:37.356 13:32:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:06:37.614 13:32:26 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:06:37.614 13:32:26 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:06:37.872 13:32:26 json_config -- json_config/json_config.sh@193 -- # uname -s 00:06:37.872 13:32:26 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:06:37.872 13:32:26 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:06:37.872 13:32:26 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:06:37.872 13:32:26 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:06:37.872 13:32:26 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:37.872 13:32:26 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:37.872 13:32:26 json_config -- json_config/json_config.sh@323 -- # killprocess 391041 00:06:37.872 13:32:26 json_config -- common/autotest_common.sh@948 -- # '[' -z 391041 ']' 00:06:37.872 13:32:26 json_config -- common/autotest_common.sh@952 -- # kill -0 391041 00:06:37.872 13:32:26 json_config -- common/autotest_common.sh@953 -- # uname 00:06:37.872 13:32:26 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:37.872 13:32:26 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 391041 00:06:38.132 13:32:26 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:38.132 13:32:26 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:38.132 13:32:26 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 391041' 00:06:38.132 killing process with pid 391041 00:06:38.132 13:32:26 json_config -- common/autotest_common.sh@967 -- # kill 391041 00:06:38.132 13:32:26 json_config -- common/autotest_common.sh@972 -- # wait 391041 00:06:41.419 13:32:29 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:41.419 13:32:29 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:06:41.419 13:32:29 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:41.419 13:32:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:41.419 13:32:29 json_config -- json_config/json_config.sh@328 -- # return 0 00:06:41.419 13:32:29 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:06:41.419 INFO: Success 00:06:41.419 00:06:41.419 real 0m28.398s 00:06:41.419 user 0m34.560s 00:06:41.419 sys 0m4.301s 00:06:41.419 13:32:29 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:41.419 13:32:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:41.419 ************************************ 00:06:41.419 END TEST json_config 00:06:41.419 ************************************ 00:06:41.419 13:32:29 -- common/autotest_common.sh@1142 -- # return 0 00:06:41.419 13:32:29 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:41.419 13:32:29 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:41.419 13:32:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:41.419 13:32:29 -- common/autotest_common.sh@10 -- # set +x 00:06:41.419 ************************************ 00:06:41.419 START TEST json_config_extra_key 00:06:41.419 ************************************ 00:06:41.419 13:32:29 
json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:41.419 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:41.419 13:32:29 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:41.419 13:32:29 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:41.419 13:32:29 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:41.419 13:32:29 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:41.419 13:32:29 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.419 13:32:29 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.419 13:32:29 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.419 13:32:29 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:41.420 13:32:29 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.420 13:32:29 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:06:41.420 13:32:29 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:41.420 13:32:29 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:41.420 13:32:29 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:41.420 13:32:29 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:41.420 13:32:29 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:41.420 13:32:29 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:41.420 13:32:29 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:41.420 13:32:29 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:41.420 INFO: launching applications... 
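The extra_key test being set up here only has to show that a target started directly from a hand-written JSON file (extra_key.json) comes up and shuts down cleanly; no RPC-driven configuration is involved. The launch that follows in the log amounts to a single command, sketched here with the workspace path taken from the log:

# Sketch: start spdk_tgt straight from a JSON config instead of configuring it over RPC.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK/build/bin/spdk_tgt" -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
    --json "$SPDK/test/json_config/extra_key.json" &
echo "target pid: $!"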
00:06:41.420 13:32:29 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:41.420 13:32:29 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:41.420 13:32:29 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:41.420 13:32:29 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:41.420 13:32:29 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:41.420 13:32:29 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:41.420 13:32:29 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:41.420 13:32:29 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:41.420 13:32:29 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=392395 00:06:41.420 13:32:29 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:41.420 Waiting for target to run... 00:06:41.420 13:32:29 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 392395 /var/tmp/spdk_tgt.sock 00:06:41.420 13:32:29 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 392395 ']' 00:06:41.420 13:32:29 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:41.420 13:32:29 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:41.420 13:32:29 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:41.420 13:32:29 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:41.420 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:41.420 13:32:29 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:41.420 13:32:29 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:41.680 [2024-07-12 13:32:30.026195] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:06:41.680 [2024-07-12 13:32:30.026273] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid392395 ] 00:06:42.248 [2024-07-12 13:32:30.621020] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.248 [2024-07-12 13:32:30.727478] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.508 13:32:30 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:42.508 13:32:30 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:06:42.508 13:32:30 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:42.508 00:06:42.508 13:32:30 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:42.508 INFO: shutting down applications... 
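waitforlisten, used after every launch in this log, blocks until the new process answers on its RPC socket; the real helper in autotest_common.sh also takes the PID and is more careful about failure cases. A simplified approximation that only polls a harmless RPC (rpc_get_methods, with the -t timeout flag this log uses elsewhere) would be:

# Sketch: poll the RPC socket until the freshly started target responds.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock"
for _ in $(seq 1 100); do
    if $RPC -t 1 rpc_get_methods >/dev/null 2>&1; then   # -t 1: one-second response timeout
        echo 'target is up'
        break
    fi
    sleep 0.5
done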
00:06:42.508 13:32:30 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:42.508 13:32:30 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:42.508 13:32:30 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:42.508 13:32:30 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 392395 ]] 00:06:42.508 13:32:30 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 392395 00:06:42.508 13:32:30 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:42.508 13:32:30 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:42.508 13:32:30 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 392395 00:06:42.508 13:32:30 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:43.076 13:32:31 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:43.076 13:32:31 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:43.076 13:32:31 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 392395 00:06:43.076 13:32:31 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:43.076 13:32:31 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:43.076 13:32:31 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:43.076 13:32:31 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:43.076 SPDK target shutdown done 00:06:43.076 13:32:31 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:43.076 Success 00:06:43.076 00:06:43.076 real 0m1.622s 00:06:43.076 user 0m1.096s 00:06:43.076 sys 0m0.727s 00:06:43.076 13:32:31 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:43.076 13:32:31 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:43.076 ************************************ 00:06:43.076 END TEST json_config_extra_key 00:06:43.076 ************************************ 00:06:43.076 13:32:31 -- common/autotest_common.sh@1142 -- # return 0 00:06:43.076 13:32:31 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:43.076 13:32:31 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:43.076 13:32:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:43.076 13:32:31 -- common/autotest_common.sh@10 -- # set +x 00:06:43.076 ************************************ 00:06:43.076 START TEST alias_rpc 00:06:43.076 ************************************ 00:06:43.076 13:32:31 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:43.076 * Looking for test storage... 
00:06:43.336 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:06:43.336 13:32:31 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:43.336 13:32:31 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=392618 00:06:43.336 13:32:31 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:43.336 13:32:31 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 392618 00:06:43.336 13:32:31 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 392618 ']' 00:06:43.336 13:32:31 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:43.336 13:32:31 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:43.336 13:32:31 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:43.336 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:43.336 13:32:31 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:43.336 13:32:31 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:43.336 [2024-07-12 13:32:31.731448] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:06:43.336 [2024-07-12 13:32:31.731527] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid392618 ] 00:06:43.336 [2024-07-12 13:32:31.865390] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.595 [2024-07-12 13:32:31.967670] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.162 13:32:32 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:44.162 13:32:32 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:06:44.162 13:32:32 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:44.421 13:32:32 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 392618 00:06:44.421 13:32:32 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 392618 ']' 00:06:44.421 13:32:32 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 392618 00:06:44.421 13:32:32 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:06:44.421 13:32:32 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:44.421 13:32:32 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 392618 00:06:44.421 13:32:32 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:44.421 13:32:32 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:44.421 13:32:32 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 392618' 00:06:44.421 killing process with pid 392618 00:06:44.421 13:32:32 alias_rpc -- common/autotest_common.sh@967 -- # kill 392618 00:06:44.421 13:32:32 alias_rpc -- common/autotest_common.sh@972 -- # wait 392618 00:06:44.987 00:06:44.987 real 0m1.802s 00:06:44.987 user 0m1.959s 00:06:44.987 sys 0m0.598s 00:06:44.987 13:32:33 alias_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:44.987 13:32:33 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:44.987 ************************************ 00:06:44.987 END TEST alias_rpc 00:06:44.987 
************************************ 00:06:44.987 13:32:33 -- common/autotest_common.sh@1142 -- # return 0 00:06:44.987 13:32:33 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:06:44.987 13:32:33 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:44.987 13:32:33 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:44.987 13:32:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.987 13:32:33 -- common/autotest_common.sh@10 -- # set +x 00:06:44.987 ************************************ 00:06:44.987 START TEST spdkcli_tcp 00:06:44.987 ************************************ 00:06:44.987 13:32:33 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:44.987 * Looking for test storage... 00:06:44.987 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:06:44.987 13:32:33 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:06:44.987 13:32:33 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:44.987 13:32:33 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:06:44.987 13:32:33 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:44.987 13:32:33 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:44.987 13:32:33 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:44.987 13:32:33 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:44.987 13:32:33 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:44.987 13:32:33 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:44.987 13:32:33 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=392934 00:06:44.987 13:32:33 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 392934 00:06:44.987 13:32:33 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:44.987 13:32:33 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 392934 ']' 00:06:44.987 13:32:33 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:44.987 13:32:33 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:44.987 13:32:33 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:44.987 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:44.987 13:32:33 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:44.987 13:32:33 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:45.244 [2024-07-12 13:32:33.624461] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
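The spdkcli_tcp test starting here runs the target on two cores (-m 0x3) and, as the next lines show, talks to its RPC server over TCP instead of the UNIX socket: socat forwards 127.0.0.1:9998 to /var/tmp/spdk.sock and rpc.py is pointed at that TCP address. A condensed sketch of the bridge, assuming a target is already listening on /var/tmp/spdk.sock:

# Sketch: expose the target's UNIX-domain RPC socket over TCP and call it through the bridge.
socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &      # same forwarding as in the log
SOCAT_PID=$!

# -r 100 retries the initial connection, -t 2 is the response timeout; flags match the test's own call.
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
    -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods

kill "$SOCAT_PID"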
00:06:45.245 [2024-07-12 13:32:33.624539] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid392934 ] 00:06:45.245 [2024-07-12 13:32:33.745126] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:45.502 [2024-07-12 13:32:33.855142] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:45.502 [2024-07-12 13:32:33.855147] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.761 13:32:34 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:45.761 13:32:34 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:06:45.761 13:32:34 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=393031 00:06:45.761 13:32:34 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:45.761 13:32:34 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:46.019 [ 00:06:46.019 "bdev_malloc_delete", 00:06:46.019 "bdev_malloc_create", 00:06:46.019 "bdev_null_resize", 00:06:46.019 "bdev_null_delete", 00:06:46.019 "bdev_null_create", 00:06:46.019 "bdev_nvme_cuse_unregister", 00:06:46.019 "bdev_nvme_cuse_register", 00:06:46.019 "bdev_opal_new_user", 00:06:46.019 "bdev_opal_set_lock_state", 00:06:46.019 "bdev_opal_delete", 00:06:46.019 "bdev_opal_get_info", 00:06:46.019 "bdev_opal_create", 00:06:46.019 "bdev_nvme_opal_revert", 00:06:46.019 "bdev_nvme_opal_init", 00:06:46.019 "bdev_nvme_send_cmd", 00:06:46.019 "bdev_nvme_get_path_iostat", 00:06:46.019 "bdev_nvme_get_mdns_discovery_info", 00:06:46.019 "bdev_nvme_stop_mdns_discovery", 00:06:46.019 "bdev_nvme_start_mdns_discovery", 00:06:46.019 "bdev_nvme_set_multipath_policy", 00:06:46.019 "bdev_nvme_set_preferred_path", 00:06:46.019 "bdev_nvme_get_io_paths", 00:06:46.019 "bdev_nvme_remove_error_injection", 00:06:46.019 "bdev_nvme_add_error_injection", 00:06:46.019 "bdev_nvme_get_discovery_info", 00:06:46.019 "bdev_nvme_stop_discovery", 00:06:46.019 "bdev_nvme_start_discovery", 00:06:46.019 "bdev_nvme_get_controller_health_info", 00:06:46.019 "bdev_nvme_disable_controller", 00:06:46.019 "bdev_nvme_enable_controller", 00:06:46.019 "bdev_nvme_reset_controller", 00:06:46.019 "bdev_nvme_get_transport_statistics", 00:06:46.019 "bdev_nvme_apply_firmware", 00:06:46.019 "bdev_nvme_detach_controller", 00:06:46.019 "bdev_nvme_get_controllers", 00:06:46.019 "bdev_nvme_attach_controller", 00:06:46.019 "bdev_nvme_set_hotplug", 00:06:46.019 "bdev_nvme_set_options", 00:06:46.019 "bdev_passthru_delete", 00:06:46.019 "bdev_passthru_create", 00:06:46.019 "bdev_lvol_set_parent_bdev", 00:06:46.019 "bdev_lvol_set_parent", 00:06:46.019 "bdev_lvol_check_shallow_copy", 00:06:46.019 "bdev_lvol_start_shallow_copy", 00:06:46.019 "bdev_lvol_grow_lvstore", 00:06:46.019 "bdev_lvol_get_lvols", 00:06:46.019 "bdev_lvol_get_lvstores", 00:06:46.019 "bdev_lvol_delete", 00:06:46.019 "bdev_lvol_set_read_only", 00:06:46.019 "bdev_lvol_resize", 00:06:46.019 "bdev_lvol_decouple_parent", 00:06:46.019 "bdev_lvol_inflate", 00:06:46.019 "bdev_lvol_rename", 00:06:46.019 "bdev_lvol_clone_bdev", 00:06:46.019 "bdev_lvol_clone", 00:06:46.019 "bdev_lvol_snapshot", 00:06:46.019 "bdev_lvol_create", 00:06:46.019 "bdev_lvol_delete_lvstore", 00:06:46.019 "bdev_lvol_rename_lvstore", 00:06:46.019 "bdev_lvol_create_lvstore", 
00:06:46.019 "bdev_raid_set_options", 00:06:46.019 "bdev_raid_remove_base_bdev", 00:06:46.019 "bdev_raid_add_base_bdev", 00:06:46.019 "bdev_raid_delete", 00:06:46.019 "bdev_raid_create", 00:06:46.019 "bdev_raid_get_bdevs", 00:06:46.019 "bdev_error_inject_error", 00:06:46.019 "bdev_error_delete", 00:06:46.019 "bdev_error_create", 00:06:46.019 "bdev_split_delete", 00:06:46.019 "bdev_split_create", 00:06:46.019 "bdev_delay_delete", 00:06:46.019 "bdev_delay_create", 00:06:46.019 "bdev_delay_update_latency", 00:06:46.019 "bdev_zone_block_delete", 00:06:46.019 "bdev_zone_block_create", 00:06:46.019 "blobfs_create", 00:06:46.019 "blobfs_detect", 00:06:46.019 "blobfs_set_cache_size", 00:06:46.019 "bdev_crypto_delete", 00:06:46.019 "bdev_crypto_create", 00:06:46.019 "bdev_compress_delete", 00:06:46.019 "bdev_compress_create", 00:06:46.019 "bdev_compress_get_orphans", 00:06:46.019 "bdev_aio_delete", 00:06:46.019 "bdev_aio_rescan", 00:06:46.019 "bdev_aio_create", 00:06:46.019 "bdev_ftl_set_property", 00:06:46.019 "bdev_ftl_get_properties", 00:06:46.019 "bdev_ftl_get_stats", 00:06:46.019 "bdev_ftl_unmap", 00:06:46.019 "bdev_ftl_unload", 00:06:46.019 "bdev_ftl_delete", 00:06:46.019 "bdev_ftl_load", 00:06:46.019 "bdev_ftl_create", 00:06:46.019 "bdev_virtio_attach_controller", 00:06:46.019 "bdev_virtio_scsi_get_devices", 00:06:46.019 "bdev_virtio_detach_controller", 00:06:46.019 "bdev_virtio_blk_set_hotplug", 00:06:46.019 "bdev_iscsi_delete", 00:06:46.019 "bdev_iscsi_create", 00:06:46.019 "bdev_iscsi_set_options", 00:06:46.019 "accel_error_inject_error", 00:06:46.019 "ioat_scan_accel_module", 00:06:46.019 "dsa_scan_accel_module", 00:06:46.019 "iaa_scan_accel_module", 00:06:46.019 "dpdk_cryptodev_get_driver", 00:06:46.019 "dpdk_cryptodev_set_driver", 00:06:46.019 "dpdk_cryptodev_scan_accel_module", 00:06:46.019 "compressdev_scan_accel_module", 00:06:46.019 "keyring_file_remove_key", 00:06:46.019 "keyring_file_add_key", 00:06:46.019 "keyring_linux_set_options", 00:06:46.019 "iscsi_get_histogram", 00:06:46.019 "iscsi_enable_histogram", 00:06:46.019 "iscsi_set_options", 00:06:46.019 "iscsi_get_auth_groups", 00:06:46.019 "iscsi_auth_group_remove_secret", 00:06:46.019 "iscsi_auth_group_add_secret", 00:06:46.019 "iscsi_delete_auth_group", 00:06:46.019 "iscsi_create_auth_group", 00:06:46.019 "iscsi_set_discovery_auth", 00:06:46.019 "iscsi_get_options", 00:06:46.019 "iscsi_target_node_request_logout", 00:06:46.019 "iscsi_target_node_set_redirect", 00:06:46.019 "iscsi_target_node_set_auth", 00:06:46.019 "iscsi_target_node_add_lun", 00:06:46.019 "iscsi_get_stats", 00:06:46.019 "iscsi_get_connections", 00:06:46.020 "iscsi_portal_group_set_auth", 00:06:46.020 "iscsi_start_portal_group", 00:06:46.020 "iscsi_delete_portal_group", 00:06:46.020 "iscsi_create_portal_group", 00:06:46.020 "iscsi_get_portal_groups", 00:06:46.020 "iscsi_delete_target_node", 00:06:46.020 "iscsi_target_node_remove_pg_ig_maps", 00:06:46.020 "iscsi_target_node_add_pg_ig_maps", 00:06:46.020 "iscsi_create_target_node", 00:06:46.020 "iscsi_get_target_nodes", 00:06:46.020 "iscsi_delete_initiator_group", 00:06:46.020 "iscsi_initiator_group_remove_initiators", 00:06:46.020 "iscsi_initiator_group_add_initiators", 00:06:46.020 "iscsi_create_initiator_group", 00:06:46.020 "iscsi_get_initiator_groups", 00:06:46.020 "nvmf_set_crdt", 00:06:46.020 "nvmf_set_config", 00:06:46.020 "nvmf_set_max_subsystems", 00:06:46.020 "nvmf_stop_mdns_prr", 00:06:46.020 "nvmf_publish_mdns_prr", 00:06:46.020 "nvmf_subsystem_get_listeners", 00:06:46.020 
"nvmf_subsystem_get_qpairs", 00:06:46.020 "nvmf_subsystem_get_controllers", 00:06:46.020 "nvmf_get_stats", 00:06:46.020 "nvmf_get_transports", 00:06:46.020 "nvmf_create_transport", 00:06:46.020 "nvmf_get_targets", 00:06:46.020 "nvmf_delete_target", 00:06:46.020 "nvmf_create_target", 00:06:46.020 "nvmf_subsystem_allow_any_host", 00:06:46.020 "nvmf_subsystem_remove_host", 00:06:46.020 "nvmf_subsystem_add_host", 00:06:46.020 "nvmf_ns_remove_host", 00:06:46.020 "nvmf_ns_add_host", 00:06:46.020 "nvmf_subsystem_remove_ns", 00:06:46.020 "nvmf_subsystem_add_ns", 00:06:46.020 "nvmf_subsystem_listener_set_ana_state", 00:06:46.020 "nvmf_discovery_get_referrals", 00:06:46.020 "nvmf_discovery_remove_referral", 00:06:46.020 "nvmf_discovery_add_referral", 00:06:46.020 "nvmf_subsystem_remove_listener", 00:06:46.020 "nvmf_subsystem_add_listener", 00:06:46.020 "nvmf_delete_subsystem", 00:06:46.020 "nvmf_create_subsystem", 00:06:46.020 "nvmf_get_subsystems", 00:06:46.020 "env_dpdk_get_mem_stats", 00:06:46.020 "nbd_get_disks", 00:06:46.020 "nbd_stop_disk", 00:06:46.020 "nbd_start_disk", 00:06:46.020 "ublk_recover_disk", 00:06:46.020 "ublk_get_disks", 00:06:46.020 "ublk_stop_disk", 00:06:46.020 "ublk_start_disk", 00:06:46.020 "ublk_destroy_target", 00:06:46.020 "ublk_create_target", 00:06:46.020 "virtio_blk_create_transport", 00:06:46.020 "virtio_blk_get_transports", 00:06:46.020 "vhost_controller_set_coalescing", 00:06:46.020 "vhost_get_controllers", 00:06:46.020 "vhost_delete_controller", 00:06:46.020 "vhost_create_blk_controller", 00:06:46.020 "vhost_scsi_controller_remove_target", 00:06:46.020 "vhost_scsi_controller_add_target", 00:06:46.020 "vhost_start_scsi_controller", 00:06:46.020 "vhost_create_scsi_controller", 00:06:46.020 "thread_set_cpumask", 00:06:46.020 "framework_get_governor", 00:06:46.020 "framework_get_scheduler", 00:06:46.020 "framework_set_scheduler", 00:06:46.020 "framework_get_reactors", 00:06:46.020 "thread_get_io_channels", 00:06:46.020 "thread_get_pollers", 00:06:46.020 "thread_get_stats", 00:06:46.020 "framework_monitor_context_switch", 00:06:46.020 "spdk_kill_instance", 00:06:46.020 "log_enable_timestamps", 00:06:46.020 "log_get_flags", 00:06:46.020 "log_clear_flag", 00:06:46.020 "log_set_flag", 00:06:46.020 "log_get_level", 00:06:46.020 "log_set_level", 00:06:46.020 "log_get_print_level", 00:06:46.020 "log_set_print_level", 00:06:46.020 "framework_enable_cpumask_locks", 00:06:46.020 "framework_disable_cpumask_locks", 00:06:46.020 "framework_wait_init", 00:06:46.020 "framework_start_init", 00:06:46.020 "scsi_get_devices", 00:06:46.020 "bdev_get_histogram", 00:06:46.020 "bdev_enable_histogram", 00:06:46.020 "bdev_set_qos_limit", 00:06:46.020 "bdev_set_qd_sampling_period", 00:06:46.020 "bdev_get_bdevs", 00:06:46.020 "bdev_reset_iostat", 00:06:46.020 "bdev_get_iostat", 00:06:46.020 "bdev_examine", 00:06:46.020 "bdev_wait_for_examine", 00:06:46.020 "bdev_set_options", 00:06:46.020 "notify_get_notifications", 00:06:46.020 "notify_get_types", 00:06:46.020 "accel_get_stats", 00:06:46.020 "accel_set_options", 00:06:46.020 "accel_set_driver", 00:06:46.020 "accel_crypto_key_destroy", 00:06:46.020 "accel_crypto_keys_get", 00:06:46.020 "accel_crypto_key_create", 00:06:46.020 "accel_assign_opc", 00:06:46.020 "accel_get_module_info", 00:06:46.020 "accel_get_opc_assignments", 00:06:46.020 "vmd_rescan", 00:06:46.020 "vmd_remove_device", 00:06:46.020 "vmd_enable", 00:06:46.020 "sock_get_default_impl", 00:06:46.020 "sock_set_default_impl", 00:06:46.020 "sock_impl_set_options", 00:06:46.020 
"sock_impl_get_options", 00:06:46.020 "iobuf_get_stats", 00:06:46.020 "iobuf_set_options", 00:06:46.020 "framework_get_pci_devices", 00:06:46.020 "framework_get_config", 00:06:46.020 "framework_get_subsystems", 00:06:46.020 "trace_get_info", 00:06:46.020 "trace_get_tpoint_group_mask", 00:06:46.020 "trace_disable_tpoint_group", 00:06:46.020 "trace_enable_tpoint_group", 00:06:46.020 "trace_clear_tpoint_mask", 00:06:46.020 "trace_set_tpoint_mask", 00:06:46.020 "keyring_get_keys", 00:06:46.020 "spdk_get_version", 00:06:46.020 "rpc_get_methods" 00:06:46.020 ] 00:06:46.020 13:32:34 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:46.020 13:32:34 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:46.020 13:32:34 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 392934 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 392934 ']' 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 392934 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 392934 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 392934' 00:06:46.020 killing process with pid 392934 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 392934 00:06:46.020 13:32:34 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 392934 00:06:46.587 00:06:46.587 real 0m1.429s 00:06:46.587 user 0m2.582s 00:06:46.587 sys 0m0.584s 00:06:46.587 13:32:34 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:46.587 13:32:34 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:46.587 ************************************ 00:06:46.587 END TEST spdkcli_tcp 00:06:46.587 ************************************ 00:06:46.587 13:32:34 -- common/autotest_common.sh@1142 -- # return 0 00:06:46.587 13:32:34 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:46.587 13:32:34 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:46.587 13:32:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.587 13:32:34 -- common/autotest_common.sh@10 -- # set +x 00:06:46.587 ************************************ 00:06:46.587 START TEST dpdk_mem_utility 00:06:46.587 ************************************ 00:06:46.587 13:32:34 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:46.587 * Looking for test storage... 
00:06:46.587 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:06:46.587 13:32:35 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:46.587 13:32:35 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=393250 00:06:46.587 13:32:35 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 393250 00:06:46.587 13:32:35 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:46.587 13:32:35 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 393250 ']' 00:06:46.587 13:32:35 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:46.587 13:32:35 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:46.588 13:32:35 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:46.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:46.588 13:32:35 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:46.588 13:32:35 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:46.588 [2024-07-12 13:32:35.117953] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:06:46.588 [2024-07-12 13:32:35.118030] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid393250 ] 00:06:46.845 [2024-07-12 13:32:35.247012] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.845 [2024-07-12 13:32:35.352339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.785 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:47.785 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:06:47.785 13:32:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:47.785 13:32:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:47.785 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:47.785 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:47.785 { 00:06:47.785 "filename": "/tmp/spdk_mem_dump.txt" 00:06:47.785 } 00:06:47.785 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:47.785 13:32:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:47.785 DPDK memory size 816.000000 MiB in 2 heap(s) 00:06:47.785 2 heaps totaling size 816.000000 MiB 00:06:47.785 size: 814.000000 MiB heap id: 0 00:06:47.785 size: 2.000000 MiB heap id: 1 00:06:47.785 end heaps---------- 00:06:47.785 8 mempools totaling size 598.116089 MiB 00:06:47.785 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:47.785 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:47.785 size: 84.521057 MiB name: bdev_io_393250 00:06:47.785 size: 51.011292 MiB name: evtpool_393250 00:06:47.785 size: 50.003479 MiB name: 
msgpool_393250 00:06:47.785 size: 21.763794 MiB name: PDU_Pool 00:06:47.785 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:47.785 size: 0.026123 MiB name: Session_Pool 00:06:47.785 end mempools------- 00:06:47.785 201 memzones totaling size 4.176453 MiB 00:06:47.785 size: 1.000366 MiB name: RG_ring_0_393250 00:06:47.785 size: 1.000366 MiB name: RG_ring_1_393250 00:06:47.785 size: 1.000366 MiB name: RG_ring_4_393250 00:06:47.785 size: 1.000366 MiB name: RG_ring_5_393250 00:06:47.785 size: 0.125366 MiB name: RG_ring_2_393250 00:06:47.785 size: 0.015991 MiB name: RG_ring_3_393250 00:06:47.785 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:47.785 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:01.7_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:06:47.785 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:01.0_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:01.1_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:01.2_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:01.3_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:01.4_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:01.5_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:01.6_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:01.7_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:02.0_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:02.1_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:02.2_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:02.3_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:02.4_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:02.5_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:02.6_qat 00:06:47.785 size: 0.000305 MiB name: 0000:da:02.7_qat 00:06:47.785 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_0 00:06:47.785 size: 0.000122 MiB 
name: rte_cryptodev_data_2 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_1 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_2 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_3 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_4 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_5 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_6 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_7 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_8 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_9 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_10 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_11 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_12 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_13 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_14 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_15 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_16 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_17 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_18 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_19 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:47.785 size: 
0.000122 MiB name: rte_compressdev_data_20 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_21 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_22 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:47.785 size: 0.000122 MiB name: rte_compressdev_data_23 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:47.785 size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_24 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_25 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_26 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_27 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_28 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_29 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_30 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_31 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_32 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_33 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_34 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_35 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_36 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_37 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_38 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_39 00:06:47.786 size: 0.000122 MiB name: 
rte_cryptodev_data_80 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_40 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_41 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_42 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_43 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_44 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_45 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_46 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_94 00:06:47.786 size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:47.786 size: 0.000122 MiB name: rte_compressdev_data_47 00:06:47.786 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:47.786 end memzones------- 00:06:47.786 13:32:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:47.786 heap id: 0 total size: 814.000000 MiB number of busy elements: 547 number of free elements: 14 00:06:47.786 list of free elements. size: 11.809692 MiB 00:06:47.786 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:47.786 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:47.786 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:47.786 element at address: 0x200003e00000 with size: 0.996460 MiB 00:06:47.786 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:47.786 element at address: 0x200013800000 with size: 0.978882 MiB 00:06:47.786 element at address: 0x200007000000 with size: 0.959839 MiB 00:06:47.786 element at address: 0x200019200000 with size: 0.937256 MiB 00:06:47.786 element at address: 0x20001aa00000 with size: 0.581055 MiB 00:06:47.786 element at address: 0x200003a00000 with size: 0.498535 MiB 00:06:47.786 element at address: 0x20000b200000 with size: 0.491272 MiB 00:06:47.786 element at address: 0x200000800000 with size: 0.486694 MiB 00:06:47.786 element at address: 0x200019400000 with size: 0.485840 MiB 00:06:47.786 element at address: 0x200027e00000 with size: 0.400146 MiB 00:06:47.786 list of standard malloc elements. 
size: 199.882019 MiB 00:06:47.786 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:47.786 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:47.786 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:47.786 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:47.786 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:47.786 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:47.786 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:47.786 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:47.786 element at address: 0x200000330b40 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000337640 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000033e140 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000344c40 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000034b740 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000352240 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000358d40 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000035f840 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000366880 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000036a340 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000036de00 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003718c0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000375380 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000378e40 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000037c900 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000383e80 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000387940 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000038b400 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000392980 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000396440 with size: 0.004395 MiB 00:06:47.786 element at address: 0x200000399f00 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003bea80 with size: 0.004395 MiB 
00:06:47.786 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003d4b00 with size: 0.004395 MiB 00:06:47.786 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:06:47.786 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:06:47.786 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000333040 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000335540 with size: 0.004028 MiB 00:06:47.786 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000339b40 with size: 0.004028 MiB 00:06:47.786 element at address: 0x20000033c040 with size: 0.004028 MiB 00:06:47.786 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000340640 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000342b40 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000347140 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000349640 with size: 0.004028 MiB 00:06:47.786 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000350140 with size: 0.004028 MiB 00:06:47.786 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000354740 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000356c40 with size: 0.004028 MiB 00:06:47.786 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:06:47.786 element at address: 0x20000035a1c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000035b240 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000035d740 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000361d40 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000364780 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000365800 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000368240 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000370840 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000373280 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000374300 with size: 0.004028 MiB 00:06:47.787 element at 
address: 0x200000376d40 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000037a800 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000037b880 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000037f340 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000381d80 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000382e00 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000385840 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000389300 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000038a380 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000038cdc0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000038de40 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000390880 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000391900 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000394340 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000397e00 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000398e80 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000039c940 with size: 0.004028 MiB 00:06:47.787 element at address: 0x20000039f380 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003c0440 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003cffc0 
with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:06:47.787 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:06:47.787 element at address: 0x200000204d40 with size: 0.000305 MiB 00:06:47.787 element at address: 0x200000200000 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200180 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200240 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200300 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200480 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200540 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200600 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200780 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200840 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200900 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200a80 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200b40 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200c00 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200d80 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200e40 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200f00 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201080 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201140 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201200 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201380 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201440 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201500 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201680 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201740 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201800 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201980 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201a40 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201b00 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201c80 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201d40 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201e00 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000201f80 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202040 with size: 0.000183 MiB 
00:06:47.787 element at address: 0x200000202100 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202280 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202340 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202400 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202580 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202640 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202700 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202880 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202940 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202a00 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202b80 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202c40 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202d00 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202e80 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000202f40 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203000 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203180 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203240 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203300 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203480 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203540 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203600 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203780 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203840 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203900 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203a80 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203b40 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203c00 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203d80 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203e40 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203f00 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000204080 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000204140 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000204200 with size: 0.000183 MiB 00:06:47.787 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000204380 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000204440 with size: 0.000183 MiB 00:06:47.787 element at address: 0x200000204500 with size: 0.000183 MiB 00:06:47.788 element at 
address: 0x2000002045c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000204680 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000204740 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000204800 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000204980 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000204a40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000204b00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000204c80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000204e80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000204f40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205000 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205180 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205240 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205300 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205480 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205540 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205600 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205780 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205840 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205900 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205a80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205b40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205c00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205d80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205e40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205f00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000206080 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000206140 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000206200 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000020a780 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022af80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b040 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b100 
with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b280 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b340 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b400 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b580 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b640 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b700 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b900 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022be40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022c080 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022c140 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022c200 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022c380 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022c440 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000022c500 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000032e700 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000331d40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000338840 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000033f340 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000345e40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000034c940 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000034fec0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000353440 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000359f40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000360a40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000364180 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000364240 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000364400 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000367a80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000367c40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000367d00 with size: 0.000183 MiB 
00:06:47.788 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000036b540 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000036b700 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000036b980 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000036f000 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000036f280 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000036f440 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000372c80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000372d40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000372f00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000376580 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000376740 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000376800 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000037a040 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000037a200 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000037a480 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000037db00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000037df40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000381780 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000381840 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000381a00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000385080 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000385240 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000385300 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000388b40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000388d00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000388f80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000038c600 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000038c880 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000390280 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000390340 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000390500 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000393b80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000393d40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000393e00 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:06:47.788 element at 
address: 0x200000397640 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000397800 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x200000397a80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000039b100 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000039b380 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000039b540 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:06:47.788 element at address: 0x20000039f000 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:06:47.788 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003a9dc0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003c3740 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003c3900 
with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20000087c980 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:47.789 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa94c00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:06:47.789 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e66700 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e667c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6d3c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6d680 with size: 0.000183 MiB 
00:06:47.789 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:06:47.789 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:06:47.790 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:06:47.790 element at 
address: 0x200027e6fc00 with size: 0.000183 MiB 00:06:47.790 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:06:47.790 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:06:47.790 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:47.790 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:47.790 list of memzone associated elements. size: 602.308289 MiB 00:06:47.790 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:47.790 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:47.790 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:47.790 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:47.790 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:47.790 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_393250_0 00:06:47.790 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:47.790 associated memzone info: size: 48.002930 MiB name: MP_evtpool_393250_0 00:06:47.790 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:47.790 associated memzone info: size: 48.002930 MiB name: MP_msgpool_393250_0 00:06:47.790 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:47.790 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:47.790 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:47.790 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:47.790 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:47.790 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_393250 00:06:47.790 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:47.790 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_393250 00:06:47.790 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:06:47.790 associated memzone info: size: 1.007996 MiB name: MP_evtpool_393250 00:06:47.790 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:47.790 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:47.790 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:47.790 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:47.790 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:47.790 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:47.790 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:47.790 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:47.790 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:47.790 associated memzone info: size: 1.000366 MiB name: RG_ring_0_393250 00:06:47.790 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:47.790 associated memzone info: size: 1.000366 MiB name: RG_ring_1_393250 00:06:47.790 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:47.790 associated memzone info: size: 1.000366 MiB name: RG_ring_4_393250 00:06:47.790 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:47.790 associated memzone info: size: 1.000366 MiB name: RG_ring_5_393250 00:06:47.790 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:06:47.790 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_393250 00:06:47.790 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:06:47.790 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:47.790 element at address: 
0x20000087cf80 with size: 0.500488 MiB 00:06:47.790 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:47.790 element at address: 0x20001947c600 with size: 0.250488 MiB 00:06:47.790 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:47.790 element at address: 0x20000020a840 with size: 0.125488 MiB 00:06:47.790 associated memzone info: size: 0.125366 MiB name: RG_ring_2_393250 00:06:47.790 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:06:47.790 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:47.790 element at address: 0x200027e66880 with size: 0.023743 MiB 00:06:47.790 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:47.790 element at address: 0x200000206580 with size: 0.016113 MiB 00:06:47.790 associated memzone info: size: 0.015991 MiB name: RG_ring_3_393250 00:06:47.790 element at address: 0x200027e6c9c0 with size: 0.002441 MiB 00:06:47.790 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:47.790 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:06:47.790 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:47.790 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:06:47.790 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:06:47.790 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:06:47.790 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:06:47.790 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:06:47.790 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:06:47.790 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:06:47.790 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:06:47.790 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:06:47.790 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:06:47.790 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:06:47.790 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:06:47.790 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:06:47.790 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:06:47.790 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:06:47.790 element at address: 0x20000039f1c0 with size: 
0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:06:47.790 element at address: 0x20000039b700 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:06:47.790 element at address: 0x200000397c40 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.1_qat 00:06:47.790 element at address: 0x200000394180 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:06:47.790 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:06:47.790 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:06:47.790 element at address: 0x200000389140 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 00:06:47.790 element at address: 0x200000385680 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:06:47.790 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.7_qat 00:06:47.790 element at address: 0x20000037e100 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:06:47.790 element at address: 0x20000037a640 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:06:47.790 element at address: 0x200000376b80 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:06:47.790 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:06:47.790 element at address: 0x20000036f600 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:06:47.790 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat 00:06:47.790 element at address: 0x200000368080 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:06:47.790 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:06:47.790 element at address: 0x200000360b00 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:06:47.790 element at address: 0x20000035d580 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:06:47.790 element at address: 0x20000035a000 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:06:47.790 element at address: 0x200000356a80 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat 00:06:47.790 element at address: 0x200000353500 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:01.4_qat 00:06:47.790 element at address: 0x20000034ff80 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat 00:06:47.790 element at address: 0x20000034ca00 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 
0.000305 MiB name: 0000:da:01.6_qat 00:06:47.790 element at address: 0x200000349480 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat 00:06:47.790 element at address: 0x200000345f00 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat 00:06:47.790 element at address: 0x200000342980 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat 00:06:47.790 element at address: 0x20000033f400 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat 00:06:47.790 element at address: 0x20000033be80 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat 00:06:47.790 element at address: 0x200000338900 with size: 0.000427 MiB 00:06:47.790 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat 00:06:47.790 element at address: 0x200000335380 with size: 0.000427 MiB 00:06:47.791 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat 00:06:47.791 element at address: 0x200000331e00 with size: 0.000427 MiB 00:06:47.791 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat 00:06:47.791 element at address: 0x20000032e880 with size: 0.000427 MiB 00:06:47.791 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat 00:06:47.791 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:06:47.791 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:47.791 element at address: 0x20000022b7c0 with size: 0.000305 MiB 00:06:47.791 associated memzone info: size: 0.000183 MiB name: MP_msgpool_393250 00:06:47.791 element at address: 0x200000206380 with size: 0.000305 MiB 00:06:47.791 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_393250 00:06:47.791 element at address: 0x200027e6d480 with size: 0.000305 MiB 00:06:47.791 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:47.791 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:47.791 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:47.791 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:06:47.791 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:47.791 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:47.791 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:06:47.791 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:47.791 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:47.791 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:06:47.791 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB 
name: rte_cryptodev_data_6 00:06:47.791 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:47.791 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:06:47.791 element at address: 0x2000003c7700 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:47.791 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:47.791 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:06:47.791 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:47.791 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:47.791 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:06:47.791 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:47.791 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:47.791 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:06:47.791 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:47.791 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:47.791 element at address: 0x2000003bc280 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:06:47.791 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:47.791 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:47.791 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:06:47.791 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:47.791 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:47.791 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:06:47.791 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:47.791 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:47.791 element at address: 0x2000003b1240 with size: 0.000244 
MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:06:47.791 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:47.791 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:47.791 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:06:47.791 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:47.791 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:47.791 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:06:47.791 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:47.791 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:47.791 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:06:47.791 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:47.791 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:47.791 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:06:47.791 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:47.791 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:47.791 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:06:47.791 element at address: 0x20000039b600 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:47.791 element at address: 0x20000039b440 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:47.791 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:06:47.791 element at address: 0x200000397b40 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:47.791 element at address: 0x200000397980 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:47.791 element at address: 0x200000397700 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:06:47.791 element at address: 0x200000394080 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: 
rte_cryptodev_data_36 00:06:47.791 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:47.791 element at address: 0x200000393c40 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:06:47.791 element at address: 0x2000003905c0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:47.791 element at address: 0x200000390400 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:47.791 element at address: 0x200000390180 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:06:47.791 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:47.791 element at address: 0x20000038c940 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:47.791 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:06:47.791 element at address: 0x200000389040 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:47.791 element at address: 0x200000388e80 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:47.791 element at address: 0x200000388c00 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:06:47.791 element at address: 0x200000385580 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:47.791 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:47.791 element at address: 0x200000385140 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:06:47.791 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:06:47.791 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:47.792 element at address: 0x200000381900 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:47.792 element at address: 0x200000381680 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:06:47.792 element at address: 0x20000037e000 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:47.792 element at address: 0x20000037de40 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:47.792 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:06:47.792 element at address: 0x20000037a540 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:47.792 element at address: 0x20000037a380 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:47.792 element at address: 0x20000037a100 with size: 
0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:06:47.792 element at address: 0x200000376a80 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:47.792 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:47.792 element at address: 0x200000376640 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:06:47.792 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:47.792 element at address: 0x200000372e00 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:47.792 element at address: 0x200000372b80 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:06:47.792 element at address: 0x20000036f500 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:47.792 element at address: 0x20000036f340 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:47.792 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:06:47.792 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:47.792 element at address: 0x20000036b880 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:47.792 element at address: 0x20000036b600 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:06:47.792 element at address: 0x200000367f80 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:47.792 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:47.792 element at address: 0x200000367b40 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:06:47.792 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:47.792 element at address: 0x200000364300 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:47.792 element at address: 0x200000364080 with size: 0.000244 MiB 00:06:47.792 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:06:47.792 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:06:47.792 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:47.792 13:32:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:47.792 13:32:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 393250 00:06:47.792 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 393250 ']' 00:06:47.792 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 393250 00:06:47.792 13:32:36 dpdk_mem_utility -- 
common/autotest_common.sh@953 -- # uname 00:06:47.792 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:47.792 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 393250 00:06:47.792 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:47.792 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:47.792 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 393250' 00:06:47.792 killing process with pid 393250 00:06:47.792 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 393250 00:06:47.792 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 393250 00:06:48.361 00:06:48.361 real 0m1.762s 00:06:48.361 user 0m1.937s 00:06:48.361 sys 0m0.552s 00:06:48.361 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:48.361 13:32:36 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:48.361 ************************************ 00:06:48.361 END TEST dpdk_mem_utility 00:06:48.361 ************************************ 00:06:48.361 13:32:36 -- common/autotest_common.sh@1142 -- # return 0 00:06:48.361 13:32:36 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:48.361 13:32:36 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:48.361 13:32:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:48.361 13:32:36 -- common/autotest_common.sh@10 -- # set +x 00:06:48.361 ************************************ 00:06:48.361 START TEST event 00:06:48.361 ************************************ 00:06:48.361 13:32:36 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:06:48.361 * Looking for test storage... 00:06:48.361 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:06:48.361 13:32:36 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:48.361 13:32:36 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:48.361 13:32:36 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:48.361 13:32:36 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:06:48.361 13:32:36 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:48.361 13:32:36 event -- common/autotest_common.sh@10 -- # set +x 00:06:48.361 ************************************ 00:06:48.361 START TEST event_perf 00:06:48.361 ************************************ 00:06:48.361 13:32:36 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:48.620 Running I/O for 1 seconds...[2024-07-12 13:32:36.968504] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
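For readers reproducing this step outside the harness, the event_perf invocation traced above can be run directly against a built SPDK tree. A minimal sketch, assuming only that SPDK_DIR points at the same checkout used by this job; the flag meanings are inferred from the surrounding log output, not from any other source:

# -m 0xF: core mask covering four cores, matching the four lcore counters reported below
# -t 1:   run the event I/O loop for one second ("Running I/O for 1 seconds...")
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK_DIR/test/event/event_perf/event_perf" -m 0xF -t 1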
00:06:48.620 [2024-07-12 13:32:36.968580] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid393501 ] 00:06:48.620 [2024-07-12 13:32:37.101408] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:48.878 [2024-07-12 13:32:37.211288] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:48.878 [2024-07-12 13:32:37.211388] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:48.878 [2024-07-12 13:32:37.211506] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:48.878 [2024-07-12 13:32:37.211507] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.812 Running I/O for 1 seconds... 00:06:49.812 lcore 0: 103859 00:06:49.812 lcore 1: 103861 00:06:49.812 lcore 2: 103864 00:06:49.812 lcore 3: 103860 00:06:49.812 done. 00:06:49.812 00:06:49.812 real 0m1.368s 00:06:49.812 user 0m4.205s 00:06:49.812 sys 0m0.150s 00:06:49.812 13:32:38 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:49.812 13:32:38 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:49.812 ************************************ 00:06:49.812 END TEST event_perf 00:06:49.812 ************************************ 00:06:49.812 13:32:38 event -- common/autotest_common.sh@1142 -- # return 0 00:06:49.812 13:32:38 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:49.812 13:32:38 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:49.812 13:32:38 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:49.813 13:32:38 event -- common/autotest_common.sh@10 -- # set +x 00:06:49.813 ************************************ 00:06:49.813 START TEST event_reactor 00:06:49.813 ************************************ 00:06:49.813 13:32:38 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:50.071 [2024-07-12 13:32:38.422124] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:06:50.071 [2024-07-12 13:32:38.422198] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid393705 ] 00:06:50.071 [2024-07-12 13:32:38.554531] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.330 [2024-07-12 13:32:38.659380] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.264 test_start 00:06:51.264 oneshot 00:06:51.264 tick 100 00:06:51.264 tick 100 00:06:51.264 tick 250 00:06:51.264 tick 100 00:06:51.264 tick 100 00:06:51.264 tick 100 00:06:51.264 tick 250 00:06:51.264 tick 500 00:06:51.264 tick 100 00:06:51.264 tick 100 00:06:51.264 tick 250 00:06:51.264 tick 100 00:06:51.264 tick 100 00:06:51.264 test_end 00:06:51.264 00:06:51.264 real 0m1.362s 00:06:51.264 user 0m1.217s 00:06:51.264 sys 0m0.138s 00:06:51.264 13:32:39 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:51.264 13:32:39 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:51.264 ************************************ 00:06:51.264 END TEST event_reactor 00:06:51.264 ************************************ 00:06:51.264 13:32:39 event -- common/autotest_common.sh@1142 -- # return 0 00:06:51.264 13:32:39 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:51.264 13:32:39 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:06:51.264 13:32:39 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.264 13:32:39 event -- common/autotest_common.sh@10 -- # set +x 00:06:51.264 ************************************ 00:06:51.264 START TEST event_reactor_perf 00:06:51.264 ************************************ 00:06:51.264 13:32:39 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:51.522 [2024-07-12 13:32:39.869945] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:06:51.522 [2024-07-12 13:32:39.870013] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid393899 ] 00:06:51.522 [2024-07-12 13:32:40.001348] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.779 [2024-07-12 13:32:40.117261] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.711 test_start 00:06:52.711 test_end 00:06:52.711 Performance: 326973 events per second 00:06:52.711 00:06:52.711 real 0m1.368s 00:06:52.711 user 0m1.208s 00:06:52.711 sys 0m0.152s 00:06:52.711 13:32:41 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:52.711 13:32:41 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:52.711 ************************************ 00:06:52.711 END TEST event_reactor_perf 00:06:52.711 ************************************ 00:06:52.711 13:32:41 event -- common/autotest_common.sh@1142 -- # return 0 00:06:52.711 13:32:41 event -- event/event.sh@49 -- # uname -s 00:06:52.711 13:32:41 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:52.711 13:32:41 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:52.711 13:32:41 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:52.711 13:32:41 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:52.711 13:32:41 event -- common/autotest_common.sh@10 -- # set +x 00:06:52.969 ************************************ 00:06:52.969 START TEST event_scheduler 00:06:52.969 ************************************ 00:06:52.969 13:32:41 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:52.969 * Looking for test storage... 00:06:52.969 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:06:52.969 13:32:41 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:52.969 13:32:41 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=394123 00:06:52.969 13:32:41 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:52.969 13:32:41 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 394123 00:06:52.969 13:32:41 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 394123 ']' 00:06:52.969 13:32:41 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:52.969 13:32:41 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:52.969 13:32:41 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:52.969 13:32:41 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:52.969 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:52.969 13:32:41 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:52.969 13:32:41 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:52.969 [2024-07-12 13:32:41.467159] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:06:52.969 [2024-07-12 13:32:41.467234] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid394123 ] 00:06:53.228 [2024-07-12 13:32:41.660623] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:53.486 [2024-07-12 13:32:41.839586] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.486 [2024-07-12 13:32:41.839666] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:53.486 [2024-07-12 13:32:41.839767] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:53.486 [2024-07-12 13:32:41.839775] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:54.053 13:32:42 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:54.053 13:32:42 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:06:54.053 13:32:42 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:54.053 13:32:42 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.053 13:32:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:54.053 [2024-07-12 13:32:42.414947] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:06:54.053 [2024-07-12 13:32:42.415005] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:06:54.053 [2024-07-12 13:32:42.415040] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:54.053 [2024-07-12 13:32:42.415066] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:54.053 [2024-07-12 13:32:42.415092] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:54.053 13:32:42 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.053 13:32:42 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:54.053 13:32:42 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.053 13:32:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:54.053 [2024-07-12 13:32:42.547785] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
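Because the scheduler app above was started with --wait-for-rpc, the test configures it over its RPC socket before initialization completes: first framework_set_scheduler dynamic, then framework_start_init, exactly as the rpc_cmd traces show. A minimal sketch of the same two calls through the standard scripts/rpc.py client, assuming SPDK_DIR is this job's SPDK checkout and using the socket path shown in the log:

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
# switch the paused app to the dynamic scheduler while it is still in --wait-for-rpc
"$SPDK_DIR/scripts/rpc.py" -s /var/tmp/spdk.sock framework_set_scheduler dynamic
# let the framework complete its deferred initialization
"$SPDK_DIR/scripts/rpc.py" -s /var/tmp/spdk.sock framework_start_init

The scheduler_thread_create, scheduler_thread_set_active and scheduler_thread_delete calls in the scheduler_create_thread test that follows are issued through a test-side RPC plugin (note the --plugin scheduler_plugin argument in the traces), so they are not plain rpc.py commands.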
00:06:54.053 13:32:42 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.053 13:32:42 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:54.053 13:32:42 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:54.053 13:32:42 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:54.053 13:32:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:54.053 ************************************ 00:06:54.053 START TEST scheduler_create_thread 00:06:54.053 ************************************ 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.053 2 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.053 3 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.053 4 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:54.053 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.054 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.313 5 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.313 6 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.313 7 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.313 8 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.313 9 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.313 10 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.313 13:32:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.572 13:32:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:54.572 13:32:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:54.572 13:32:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:54.572 13:32:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:54.572 13:32:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:55.507 13:32:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:55.507 13:32:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:55.507 13:32:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:55.507 13:32:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.442 13:32:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:56.442 13:32:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:56.442 13:32:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:56.442 13:32:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:56.442 13:32:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.377 13:32:45 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.377 00:06:57.377 real 0m3.233s 00:06:57.377 user 0m0.026s 00:06:57.377 sys 0m0.006s 00:06:57.377 13:32:45 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:57.377 13:32:45 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.377 ************************************ 00:06:57.377 END TEST scheduler_create_thread 00:06:57.377 ************************************ 00:06:57.377 13:32:45 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:06:57.377 13:32:45 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:57.378 13:32:45 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 394123 00:06:57.378 13:32:45 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 394123 ']' 00:06:57.378 13:32:45 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 394123 00:06:57.378 13:32:45 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:06:57.378 13:32:45 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:57.378 13:32:45 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 394123 00:06:57.378 13:32:45 event.event_scheduler -- common/autotest_common.sh@954 -- # process_name=reactor_2 00:06:57.378 13:32:45 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:06:57.378 13:32:45 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 394123' 00:06:57.378 killing process with pid 394123 00:06:57.378 13:32:45 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 394123 00:06:57.378 13:32:45 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 394123 00:06:57.636 [2024-07-12 13:32:46.206987] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
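The scheduler_create_thread test traced above drives the scheduler test application purely through plugin RPCs. As a minimal sketch of the same create/activate/delete cycle done by hand — assuming the test app is listening on the default RPC socket, the commands are run from the spdk checkout root, and PYTHONPATH exposes the scheduler_plugin module shipped with the test — the calls reduce to:

    # create an always-active thread pinned to core 0, reporting 100% load
    ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
    # adjust the reported activity of an existing thread (id 11 in the trace, set to 50%)
    ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_set_active 11 50
    # create a throw-away thread, then delete it using the id printed by the create call
    ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
    ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_delete 12

The thread ids (11 and 12 above) are whatever scheduler_thread_create printed for that particular run; they are not fixed values.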
00:06:58.205 00:06:58.205 real 0m5.309s 00:06:58.205 user 0m10.375s 00:06:58.205 sys 0m0.615s 00:06:58.205 13:32:46 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.205 13:32:46 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:58.205 ************************************ 00:06:58.205 END TEST event_scheduler 00:06:58.205 ************************************ 00:06:58.205 13:32:46 event -- common/autotest_common.sh@1142 -- # return 0 00:06:58.205 13:32:46 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:58.205 13:32:46 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:58.205 13:32:46 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:58.205 13:32:46 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.205 13:32:46 event -- common/autotest_common.sh@10 -- # set +x 00:06:58.205 ************************************ 00:06:58.205 START TEST app_repeat 00:06:58.205 ************************************ 00:06:58.205 13:32:46 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@19 -- # repeat_pid=394874 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 394874' 00:06:58.205 Process app_repeat pid: 394874 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:58.205 spdk_app_start Round 0 00:06:58.205 13:32:46 event.app_repeat -- event/event.sh@25 -- # waitforlisten 394874 /var/tmp/spdk-nbd.sock 00:06:58.205 13:32:46 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 394874 ']' 00:06:58.205 13:32:46 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:58.205 13:32:46 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:58.205 13:32:46 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:58.205 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:58.205 13:32:46 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:58.205 13:32:46 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:58.205 [2024-07-12 13:32:46.744043] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:06:58.205 [2024-07-12 13:32:46.744110] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid394874 ] 00:06:58.464 [2024-07-12 13:32:46.877373] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:58.464 [2024-07-12 13:32:46.987954] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:58.464 [2024-07-12 13:32:46.987960] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.400 13:32:47 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:59.400 13:32:47 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:06:59.400 13:32:47 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:59.400 Malloc0 00:06:59.400 13:32:47 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:59.659 Malloc1 00:06:59.918 13:32:48 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:59.918 13:32:48 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:59.918 /dev/nbd0 00:07:00.177 13:32:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:00.177 13:32:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:00.177 1+0 records in 00:07:00.177 1+0 records out 00:07:00.177 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237991 s, 17.2 MB/s 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:00.177 13:32:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:00.177 13:32:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:00.177 13:32:48 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:00.177 /dev/nbd1 00:07:00.177 13:32:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:00.177 13:32:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:00.177 1+0 records in 00:07:00.177 1+0 records out 00:07:00.177 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283512 s, 14.4 MB/s 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:00.177 13:32:48 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:00.177 13:32:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:00.177 13:32:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:00.177 13:32:48 event.app_repeat -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:00.177 13:32:48 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:00.177 13:32:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:00.436 13:32:49 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:00.436 { 00:07:00.436 "nbd_device": "/dev/nbd0", 00:07:00.436 "bdev_name": "Malloc0" 00:07:00.436 }, 00:07:00.436 { 00:07:00.436 "nbd_device": "/dev/nbd1", 00:07:00.436 "bdev_name": "Malloc1" 00:07:00.436 } 00:07:00.436 ]' 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:00.694 { 00:07:00.694 "nbd_device": "/dev/nbd0", 00:07:00.694 "bdev_name": "Malloc0" 00:07:00.694 }, 00:07:00.694 { 00:07:00.694 "nbd_device": "/dev/nbd1", 00:07:00.694 "bdev_name": "Malloc1" 00:07:00.694 } 00:07:00.694 ]' 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:00.694 /dev/nbd1' 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:00.694 /dev/nbd1' 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:00.694 256+0 records in 00:07:00.694 256+0 records out 00:07:00.694 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106991 s, 98.0 MB/s 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:00.694 256+0 records in 00:07:00.694 256+0 records out 00:07:00.694 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0299033 s, 35.1 MB/s 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:00.694 256+0 records in 00:07:00.694 256+0 records out 00:07:00.694 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0312689 s, 33.5 MB/s 00:07:00.694 13:32:49 event.app_repeat -- 
bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:00.694 13:32:49 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:00.695 13:32:49 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:00.695 13:32:49 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:00.695 13:32:49 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:00.695 13:32:49 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:00.953 13:32:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:00.953 13:32:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:00.953 13:32:49 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:00.953 13:32:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:00.953 13:32:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:00.953 13:32:49 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:00.953 13:32:49 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:00.953 13:32:49 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:00.953 13:32:49 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:00.953 13:32:49 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:01.211 13:32:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:01.211 13:32:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:01.211 13:32:49 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:01.211 13:32:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:01.211 13:32:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:01.211 13:32:49 
event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:01.211 13:32:49 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:01.211 13:32:49 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:01.211 13:32:49 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:01.211 13:32:49 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:01.211 13:32:49 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:01.470 13:32:49 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:01.470 13:32:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:01.470 13:32:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:01.470 13:32:50 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:01.470 13:32:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:01.470 13:32:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:01.470 13:32:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:01.470 13:32:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:01.470 13:32:50 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:01.470 13:32:50 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:01.470 13:32:50 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:01.470 13:32:50 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:01.470 13:32:50 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:01.728 13:32:50 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:01.988 [2024-07-12 13:32:50.493318] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:02.247 [2024-07-12 13:32:50.592606] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:02.247 [2024-07-12 13:32:50.592610] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.247 [2024-07-12 13:32:50.644761] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:02.247 [2024-07-12 13:32:50.644815] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:04.778 13:32:53 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:04.778 13:32:53 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:04.778 spdk_app_start Round 1 00:07:04.778 13:32:53 event.app_repeat -- event/event.sh@25 -- # waitforlisten 394874 /var/tmp/spdk-nbd.sock 00:07:04.778 13:32:53 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 394874 ']' 00:07:04.778 13:32:53 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:04.778 13:32:53 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:04.778 13:32:53 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:04.778 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
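Round 0 above is one full app_repeat cycle: start the app, create two malloc bdevs, export them over nbd, verify I/O, stop the exports, and kill the app. Condensed into the underlying commands — a sketch only, assuming the nbd kernel module is loaded and the commands run from the spdk checkout root — the launch and per-round setup look roughly like:

    # start the repeat app as in the trace: RPCs on a private socket, core mask 0x3 (cores 0-1), -t 4 as passed by event.sh
    ./test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 &
    # two 64 MiB malloc bdevs with a 4096-byte block size (rpc.py prints the names Malloc0, Malloc1)
    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
    # export them through the kernel nbd driver
    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1

The test script additionally waits for /proc/partitions to list each nbd device (the waitfornbd loop in the trace) before touching it.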
00:07:04.778 13:32:53 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:04.778 13:32:53 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:05.037 13:32:53 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:05.037 13:32:53 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:05.037 13:32:53 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:05.295 Malloc0 00:07:05.295 13:32:53 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:05.554 Malloc1 00:07:05.554 13:32:53 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:05.554 13:32:53 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:05.554 13:32:53 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:05.555 13:32:53 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:05.813 /dev/nbd0 00:07:05.813 13:32:54 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:05.813 13:32:54 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:05.813 13:32:54 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:05.813 13:32:54 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:05.813 13:32:54 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:05.814 13:32:54 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:05.814 13:32:54 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:05.814 13:32:54 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:05.814 13:32:54 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:05.814 13:32:54 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:05.814 13:32:54 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:07:05.814 1+0 records in 00:07:05.814 1+0 records out 00:07:05.814 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226069 s, 18.1 MB/s 00:07:05.814 13:32:54 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:05.814 13:32:54 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:05.814 13:32:54 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:05.814 13:32:54 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:05.814 13:32:54 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:05.814 13:32:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:05.814 13:32:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:05.814 13:32:54 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:06.072 /dev/nbd1 00:07:06.072 13:32:54 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:06.072 13:32:54 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:06.072 1+0 records in 00:07:06.072 1+0 records out 00:07:06.072 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278519 s, 14.7 MB/s 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:06.072 13:32:54 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:06.072 13:32:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:06.072 13:32:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:06.072 13:32:54 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:06.072 13:32:54 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:06.072 13:32:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:07:06.332 { 00:07:06.332 "nbd_device": "/dev/nbd0", 00:07:06.332 "bdev_name": "Malloc0" 00:07:06.332 }, 00:07:06.332 { 00:07:06.332 "nbd_device": "/dev/nbd1", 00:07:06.332 "bdev_name": "Malloc1" 00:07:06.332 } 00:07:06.332 ]' 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:06.332 { 00:07:06.332 "nbd_device": "/dev/nbd0", 00:07:06.332 "bdev_name": "Malloc0" 00:07:06.332 }, 00:07:06.332 { 00:07:06.332 "nbd_device": "/dev/nbd1", 00:07:06.332 "bdev_name": "Malloc1" 00:07:06.332 } 00:07:06.332 ]' 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:06.332 /dev/nbd1' 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:06.332 /dev/nbd1' 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:06.332 256+0 records in 00:07:06.332 256+0 records out 00:07:06.332 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114129 s, 91.9 MB/s 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:06.332 256+0 records in 00:07:06.332 256+0 records out 00:07:06.332 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0297692 s, 35.2 MB/s 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:06.332 256+0 records in 00:07:06.332 256+0 records out 00:07:06.332 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0207777 s, 50.5 MB/s 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:06.332 13:32:54 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:06.591 13:32:55 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:06.591 13:32:55 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:06.591 13:32:55 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:06.591 13:32:55 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:06.591 13:32:55 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:06.591 13:32:55 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:06.591 13:32:55 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:06.591 13:32:55 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:06.591 13:32:55 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:06.591 13:32:55 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:06.850 13:32:55 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:06.850 13:32:55 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:06.850 13:32:55 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:06.850 13:32:55 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:06.850 13:32:55 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:06.850 13:32:55 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:06.850 13:32:55 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:06.850 13:32:55 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:06.850 13:32:55 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:06.850 13:32:55 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:07:06.850 13:32:55 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:07.108 13:32:55 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:07.108 13:32:55 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:07.367 13:32:55 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:07.626 [2024-07-12 13:32:56.111242] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:07.626 [2024-07-12 13:32:56.206232] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:07.626 [2024-07-12 13:32:56.206236] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.885 [2024-07-12 13:32:56.253506] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:07.885 [2024-07-12 13:32:56.253557] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:10.416 13:32:58 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:10.416 13:32:58 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:10.416 spdk_app_start Round 2 00:07:10.416 13:32:58 event.app_repeat -- event/event.sh@25 -- # waitforlisten 394874 /var/tmp/spdk-nbd.sock 00:07:10.416 13:32:58 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 394874 ']' 00:07:10.416 13:32:58 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:10.416 13:32:58 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:10.416 13:32:58 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:10.416 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
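The write/verify step repeated in every round (nbd_dd_data_verify in the trace) is just dd plus cmp: fill a scratch file with random data, copy it onto each exported nbd device with O_DIRECT, then compare the device contents back against the file. A condensed sketch, with the long workspace path shortened to nbdrandtest for readability:

    dd if=/dev/urandom of=nbdrandtest bs=4096 count=256
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if=nbdrandtest of="$nbd" bs=4096 count=256 oflag=direct
        cmp -b -n 1M nbdrandtest "$nbd"   # any mismatch makes cmp exit non-zero and fails the round
    done
    rm nbdrandtest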
00:07:10.416 13:32:58 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:10.416 13:32:58 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:10.675 13:32:59 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:10.675 13:32:59 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:10.675 13:32:59 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:10.934 Malloc0 00:07:10.935 13:32:59 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:11.195 Malloc1 00:07:11.195 13:32:59 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:11.195 13:32:59 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:11.454 /dev/nbd0 00:07:11.454 13:32:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:11.454 13:32:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:07:11.454 1+0 records in 00:07:11.454 1+0 records out 00:07:11.454 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00016838 s, 24.3 MB/s 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:11.454 13:32:59 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:11.454 13:32:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:11.454 13:32:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:11.454 13:32:59 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:11.713 /dev/nbd1 00:07:11.713 13:33:00 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:11.713 13:33:00 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:11.713 13:33:00 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:11.713 13:33:00 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:11.713 13:33:00 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:11.713 13:33:00 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:11.714 13:33:00 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:11.714 13:33:00 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:11.714 13:33:00 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:11.714 13:33:00 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:11.714 13:33:00 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:11.714 1+0 records in 00:07:11.714 1+0 records out 00:07:11.714 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021783 s, 18.8 MB/s 00:07:11.714 13:33:00 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:11.714 13:33:00 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:11.714 13:33:00 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:11.714 13:33:00 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:11.714 13:33:00 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:11.714 13:33:00 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:11.714 13:33:00 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:11.714 13:33:00 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:11.714 13:33:00 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:11.714 13:33:00 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:11.973 
{ 00:07:11.973 "nbd_device": "/dev/nbd0", 00:07:11.973 "bdev_name": "Malloc0" 00:07:11.973 }, 00:07:11.973 { 00:07:11.973 "nbd_device": "/dev/nbd1", 00:07:11.973 "bdev_name": "Malloc1" 00:07:11.973 } 00:07:11.973 ]' 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:11.973 { 00:07:11.973 "nbd_device": "/dev/nbd0", 00:07:11.973 "bdev_name": "Malloc0" 00:07:11.973 }, 00:07:11.973 { 00:07:11.973 "nbd_device": "/dev/nbd1", 00:07:11.973 "bdev_name": "Malloc1" 00:07:11.973 } 00:07:11.973 ]' 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:11.973 /dev/nbd1' 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:11.973 /dev/nbd1' 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:11.973 256+0 records in 00:07:11.973 256+0 records out 00:07:11.973 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106922 s, 98.1 MB/s 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:11.973 256+0 records in 00:07:11.973 256+0 records out 00:07:11.973 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0316222 s, 33.2 MB/s 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:11.973 256+0 records in 00:07:11.973 256+0 records out 00:07:11.973 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0310833 s, 33.7 MB/s 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:11.973 13:33:00 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:12.232 13:33:00 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:12.232 13:33:00 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:12.232 13:33:00 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:12.232 13:33:00 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:12.232 13:33:00 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:12.232 13:33:00 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:12.232 13:33:00 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:12.232 13:33:00 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:12.491 13:33:00 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:12.491 13:33:00 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:12.491 13:33:00 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:12.491 13:33:00 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:12.491 13:33:00 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:12.491 13:33:00 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:12.491 13:33:00 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:12.491 13:33:00 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:12.491 13:33:00 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:12.491 13:33:00 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:12.752 13:33:01 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:13.011 13:33:01 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:13.011 13:33:01 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:13.011 13:33:01 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:13.011 13:33:01 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:13.011 13:33:01 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:13.011 13:33:01 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:13.011 13:33:01 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:13.011 13:33:01 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:13.011 13:33:01 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:13.011 13:33:01 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:13.270 13:33:01 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:13.530 [2024-07-12 13:33:01.860966] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:13.530 [2024-07-12 13:33:01.961116] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:13.530 [2024-07-12 13:33:01.961120] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.530 [2024-07-12 13:33:02.012226] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:13.530 [2024-07-12 13:33:02.012284] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:16.064 13:33:04 event.app_repeat -- event/event.sh@38 -- # waitforlisten 394874 /var/tmp/spdk-nbd.sock 00:07:16.064 13:33:04 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 394874 ']' 00:07:16.064 13:33:04 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:16.064 13:33:04 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:16.064 13:33:04 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:16.064 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
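After both exports are stopped, each round confirms that the app no longer reports any nbd devices by parsing nbd_get_disks output with jq, exactly as the trace shows. The equivalent one-liner, under the same socket assumption, is:

    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true
    # prints 2 while Malloc0/Malloc1 are exported and 0 once both disks are stopped;
    # the trailing || true mirrors the trace, since grep -c exits non-zero when the count is 0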
00:07:16.064 13:33:04 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:16.064 13:33:04 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:16.323 13:33:04 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:16.323 13:33:04 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:16.323 13:33:04 event.app_repeat -- event/event.sh@39 -- # killprocess 394874 00:07:16.323 13:33:04 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 394874 ']' 00:07:16.323 13:33:04 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 394874 00:07:16.323 13:33:04 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:07:16.323 13:33:04 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:16.323 13:33:04 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 394874 00:07:16.581 13:33:04 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:16.581 13:33:04 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:16.582 13:33:04 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 394874' 00:07:16.582 killing process with pid 394874 00:07:16.582 13:33:04 event.app_repeat -- common/autotest_common.sh@967 -- # kill 394874 00:07:16.582 13:33:04 event.app_repeat -- common/autotest_common.sh@972 -- # wait 394874 00:07:16.582 spdk_app_start is called in Round 0. 00:07:16.582 Shutdown signal received, stop current app iteration 00:07:16.582 Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 reinitialization... 00:07:16.582 spdk_app_start is called in Round 1. 00:07:16.582 Shutdown signal received, stop current app iteration 00:07:16.582 Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 reinitialization... 00:07:16.582 spdk_app_start is called in Round 2. 00:07:16.582 Shutdown signal received, stop current app iteration 00:07:16.582 Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 reinitialization... 00:07:16.582 spdk_app_start is called in Round 3. 
00:07:16.582 Shutdown signal received, stop current app iteration 00:07:16.582 13:33:05 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:16.582 13:33:05 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:16.582 00:07:16.582 real 0m18.460s 00:07:16.582 user 0m39.649s 00:07:16.582 sys 0m3.855s 00:07:16.582 13:33:05 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:16.582 13:33:05 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:16.582 ************************************ 00:07:16.582 END TEST app_repeat 00:07:16.582 ************************************ 00:07:16.840 13:33:05 event -- common/autotest_common.sh@1142 -- # return 0 00:07:16.840 13:33:05 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:16.840 00:07:16.840 real 0m28.417s 00:07:16.840 user 0m56.868s 00:07:16.840 sys 0m5.289s 00:07:16.840 13:33:05 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:16.840 13:33:05 event -- common/autotest_common.sh@10 -- # set +x 00:07:16.840 ************************************ 00:07:16.840 END TEST event 00:07:16.840 ************************************ 00:07:16.840 13:33:05 -- common/autotest_common.sh@1142 -- # return 0 00:07:16.840 13:33:05 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:16.840 13:33:05 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:16.841 13:33:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:16.841 13:33:05 -- common/autotest_common.sh@10 -- # set +x 00:07:16.841 ************************************ 00:07:16.841 START TEST thread 00:07:16.841 ************************************ 00:07:16.841 13:33:05 thread -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:16.841 * Looking for test storage... 00:07:16.841 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:16.841 13:33:05 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:16.841 13:33:05 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:16.841 13:33:05 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:16.841 13:33:05 thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.099 ************************************ 00:07:17.099 START TEST thread_poller_perf 00:07:17.099 ************************************ 00:07:17.099 13:33:05 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:17.099 [2024-07-12 13:33:05.464865] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:07:17.099 [2024-07-12 13:33:05.464951] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid397963 ] 00:07:17.099 [2024-07-12 13:33:05.595916] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.357 [2024-07-12 13:33:05.698731] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.357 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:18.304 ====================================== 00:07:18.304 busy:2314351538 (cyc) 00:07:18.304 total_run_count: 267000 00:07:18.304 tsc_hz: 2300000000 (cyc) 00:07:18.304 ====================================== 00:07:18.304 poller_cost: 8667 (cyc), 3768 (nsec) 00:07:18.304 00:07:18.304 real 0m1.368s 00:07:18.304 user 0m1.222s 00:07:18.304 sys 0m0.139s 00:07:18.304 13:33:06 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:18.304 13:33:06 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:18.304 ************************************ 00:07:18.304 END TEST thread_poller_perf 00:07:18.304 ************************************ 00:07:18.304 13:33:06 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:18.304 13:33:06 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:18.304 13:33:06 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:18.304 13:33:06 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:18.304 13:33:06 thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.304 ************************************ 00:07:18.304 START TEST thread_poller_perf 00:07:18.304 ************************************ 00:07:18.304 13:33:06 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:18.563 [2024-07-12 13:33:06.912444] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:07:18.563 [2024-07-12 13:33:06.912506] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid398272 ] 00:07:18.563 [2024-07-12 13:33:07.038865] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.563 [2024-07-12 13:33:07.137396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.563 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:07:19.939 ====================================== 00:07:19.939 busy:2302551266 (cyc) 00:07:19.939 total_run_count: 3495000 00:07:19.939 tsc_hz: 2300000000 (cyc) 00:07:19.939 ====================================== 00:07:19.939 poller_cost: 658 (cyc), 286 (nsec) 00:07:19.939 00:07:19.939 real 0m1.342s 00:07:19.939 user 0m1.206s 00:07:19.939 sys 0m0.129s 00:07:19.939 13:33:08 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:19.939 13:33:08 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:19.939 ************************************ 00:07:19.939 END TEST thread_poller_perf 00:07:19.939 ************************************ 00:07:19.939 13:33:08 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:19.939 13:33:08 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:07:19.939 00:07:19.939 real 0m2.974s 00:07:19.939 user 0m2.518s 00:07:19.939 sys 0m0.464s 00:07:19.939 13:33:08 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:19.939 13:33:08 thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.939 ************************************ 00:07:19.939 END TEST thread 00:07:19.939 ************************************ 00:07:19.939 13:33:08 -- common/autotest_common.sh@1142 -- # return 0 00:07:19.939 13:33:08 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:19.939 13:33:08 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:19.939 13:33:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.939 13:33:08 -- common/autotest_common.sh@10 -- # set +x 00:07:19.939 ************************************ 00:07:19.939 START TEST accel 00:07:19.939 ************************************ 00:07:19.939 13:33:08 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:19.939 * Looking for test storage... 00:07:19.939 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:19.939 13:33:08 accel -- accel/accel.sh@95 -- # declare -A expected_opcs 00:07:19.939 13:33:08 accel -- accel/accel.sh@96 -- # get_expected_opcs 00:07:19.939 13:33:08 accel -- accel/accel.sh@69 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:19.939 13:33:08 accel -- accel/accel.sh@71 -- # spdk_tgt_pid=398506 00:07:19.939 13:33:08 accel -- accel/accel.sh@72 -- # waitforlisten 398506 00:07:19.939 13:33:08 accel -- common/autotest_common.sh@829 -- # '[' -z 398506 ']' 00:07:19.939 13:33:08 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:19.939 13:33:08 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:19.939 13:33:08 accel -- accel/accel.sh@70 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:19.939 13:33:08 accel -- accel/accel.sh@70 -- # build_accel_config 00:07:19.939 13:33:08 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:19.939 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:19.939 13:33:08 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:19.939 13:33:08 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:19.939 13:33:08 accel -- common/autotest_common.sh@10 -- # set +x 00:07:19.939 13:33:08 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:19.939 13:33:08 accel -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:19.939 13:33:08 accel -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:19.939 13:33:08 accel -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:19.939 13:33:08 accel -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:19.939 13:33:08 accel -- accel/accel.sh@49 -- # local IFS=, 00:07:19.939 13:33:08 accel -- accel/accel.sh@50 -- # jq -r . 00:07:19.939 [2024-07-12 13:33:08.521486] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:07:19.939 [2024-07-12 13:33:08.521559] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid398506 ] 00:07:20.199 [2024-07-12 13:33:08.651911] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.199 [2024-07-12 13:33:08.756530] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@862 -- # return 0 00:07:21.137 13:33:09 accel -- accel/accel.sh@74 -- # [[ 0 -gt 0 ]] 00:07:21.137 13:33:09 accel -- accel/accel.sh@77 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:21.137 13:33:09 accel -- accel/accel.sh@78 -- # [[ 0 -gt 0 ]] 00:07:21.137 13:33:09 accel -- accel/accel.sh@81 -- # [[ 0 -gt 0 ]] 00:07:21.137 13:33:09 accel -- accel/accel.sh@82 -- # [[ -n '' ]] 00:07:21.137 13:33:09 accel -- accel/accel.sh@84 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:21.137 13:33:09 accel -- accel/accel.sh@84 -- # rpc_cmd accel_get_opc_assignments 00:07:21.137 13:33:09 accel -- accel/accel.sh@84 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@10 -- # set +x 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 
13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # IFS== 00:07:21.137 13:33:09 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:21.137 13:33:09 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:21.137 13:33:09 accel -- accel/accel.sh@89 -- # killprocess 398506 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@948 -- # '[' -z 398506 ']' 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@952 -- # kill -0 398506 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@953 -- # uname 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 398506 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 398506' 00:07:21.137 killing process with pid 398506 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@967 -- # kill 398506 00:07:21.137 13:33:09 accel -- common/autotest_common.sh@972 -- # wait 398506 00:07:21.397 13:33:09 accel -- accel/accel.sh@90 -- # trap - ERR 00:07:21.397 13:33:09 accel -- accel/accel.sh@103 -- # run_test accel_help accel_perf -h 00:07:21.397 13:33:09 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:21.397 13:33:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:21.397 13:33:09 accel -- common/autotest_common.sh@10 -- # set +x 00:07:21.397 13:33:09 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:07:21.397 13:33:09 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:21.397 13:33:09 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:07:21.397 13:33:09 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:21.397 13:33:09 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:21.397 13:33:09 accel.accel_help -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:21.397 13:33:09 accel.accel_help -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:21.397 13:33:09 accel.accel_help -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:21.397 13:33:09 accel.accel_help -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:21.397 13:33:09 accel.accel_help -- accel/accel.sh@49 -- # local 
IFS=, 00:07:21.397 13:33:09 accel.accel_help -- accel/accel.sh@50 -- # jq -r . 00:07:21.656 13:33:10 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:21.656 13:33:10 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:07:21.656 13:33:10 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:21.656 13:33:10 accel -- accel/accel.sh@105 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:21.656 13:33:10 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:21.656 13:33:10 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:21.656 13:33:10 accel -- common/autotest_common.sh@10 -- # set +x 00:07:21.656 ************************************ 00:07:21.656 START TEST accel_missing_filename 00:07:21.656 ************************************ 00:07:21.656 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:07:21.656 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:07:21.656 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:21.656 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:21.656 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:21.656 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:21.656 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:21.656 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:07:21.656 13:33:10 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:21.656 13:33:10 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:07:21.656 13:33:10 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:21.656 13:33:10 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:21.656 13:33:10 accel.accel_missing_filename -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:21.656 13:33:10 accel.accel_missing_filename -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:21.656 13:33:10 accel.accel_missing_filename -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:21.656 13:33:10 accel.accel_missing_filename -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:21.656 13:33:10 accel.accel_missing_filename -- accel/accel.sh@49 -- # local IFS=, 00:07:21.656 13:33:10 accel.accel_missing_filename -- accel/accel.sh@50 -- # jq -r . 00:07:21.656 [2024-07-12 13:33:10.124952] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:07:21.657 [2024-07-12 13:33:10.125013] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid398842 ] 00:07:21.915 [2024-07-12 13:33:10.254083] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.915 [2024-07-12 13:33:10.354117] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.915 [2024-07-12 13:33:10.423662] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:21.915 [2024-07-12 13:33:10.497897] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:22.175 A filename is required. 00:07:22.175 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:07:22.175 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:22.175 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:07:22.175 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:07:22.175 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:07:22.175 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:22.175 00:07:22.175 real 0m0.506s 00:07:22.175 user 0m0.335s 00:07:22.175 sys 0m0.196s 00:07:22.175 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:22.175 13:33:10 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:07:22.175 ************************************ 00:07:22.175 END TEST accel_missing_filename 00:07:22.175 ************************************ 00:07:22.175 13:33:10 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:22.175 13:33:10 accel -- accel/accel.sh@107 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:22.175 13:33:10 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:22.175 13:33:10 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:22.175 13:33:10 accel -- common/autotest_common.sh@10 -- # set +x 00:07:22.175 ************************************ 00:07:22.175 START TEST accel_compress_verify 00:07:22.175 ************************************ 00:07:22.175 13:33:10 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:22.175 13:33:10 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:07:22.175 13:33:10 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:22.175 13:33:10 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:22.175 13:33:10 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:22.175 13:33:10 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:22.175 13:33:10 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:22.175 13:33:10 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:22.175 13:33:10 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:22.175 13:33:10 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:22.175 13:33:10 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:22.175 13:33:10 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:22.175 13:33:10 accel.accel_compress_verify -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:22.175 13:33:10 accel.accel_compress_verify -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:22.175 13:33:10 accel.accel_compress_verify -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:22.175 13:33:10 accel.accel_compress_verify -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:22.175 13:33:10 accel.accel_compress_verify -- accel/accel.sh@49 -- # local IFS=, 00:07:22.175 13:33:10 accel.accel_compress_verify -- accel/accel.sh@50 -- # jq -r . 00:07:22.175 [2024-07-12 13:33:10.716375] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:07:22.175 [2024-07-12 13:33:10.716440] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid398909 ] 00:07:22.434 [2024-07-12 13:33:10.848133] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.435 [2024-07-12 13:33:10.955026] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.694 [2024-07-12 13:33:11.018338] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:22.694 [2024-07-12 13:33:11.091373] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:22.694 00:07:22.694 Compression does not support the verify option, aborting. 
00:07:22.694 13:33:11 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:07:22.694 13:33:11 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:22.694 13:33:11 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:07:22.694 13:33:11 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:07:22.694 13:33:11 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:07:22.694 13:33:11 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:22.694 00:07:22.694 real 0m0.510s 00:07:22.694 user 0m0.346s 00:07:22.694 sys 0m0.198s 00:07:22.694 13:33:11 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:22.694 13:33:11 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:07:22.694 ************************************ 00:07:22.694 END TEST accel_compress_verify 00:07:22.694 ************************************ 00:07:22.694 13:33:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:22.694 13:33:11 accel -- accel/accel.sh@109 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:07:22.694 13:33:11 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:22.694 13:33:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:22.694 13:33:11 accel -- common/autotest_common.sh@10 -- # set +x 00:07:22.694 ************************************ 00:07:22.694 START TEST accel_wrong_workload 00:07:22.694 ************************************ 00:07:22.694 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:07:22.694 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:07:22.694 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:22.694 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:22.694 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:22.694 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:22.694 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:22.694 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:07:22.694 13:33:11 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:22.694 13:33:11 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:07:22.694 13:33:11 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:22.694 13:33:11 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:22.953 13:33:11 accel.accel_wrong_workload -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:22.953 13:33:11 accel.accel_wrong_workload -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:22.953 13:33:11 accel.accel_wrong_workload -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:22.953 13:33:11 accel.accel_wrong_workload -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:22.953 13:33:11 accel.accel_wrong_workload -- accel/accel.sh@49 -- # local IFS=, 00:07:22.953 13:33:11 accel.accel_wrong_workload -- accel/accel.sh@50 -- # jq -r . 
00:07:22.953 Unsupported workload type: foobar 00:07:22.953 [2024-07-12 13:33:11.299821] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:22.953 accel_perf options: 00:07:22.953 [-h help message] 00:07:22.953 [-q queue depth per core] 00:07:22.954 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:22.954 [-T number of threads per core 00:07:22.954 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:22.954 [-t time in seconds] 00:07:22.954 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:22.954 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:22.954 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:22.954 [-l for compress/decompress workloads, name of uncompressed input file 00:07:22.954 [-S for crc32c workload, use this seed value (default 0) 00:07:22.954 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:22.954 [-f for fill workload, use this BYTE value (default 255) 00:07:22.954 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:22.954 [-y verify result if this switch is on] 00:07:22.954 [-a tasks to allocate per core (default: same value as -q)] 00:07:22.954 Can be used to spread operations across a wider range of memory. 00:07:22.954 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:07:22.954 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:22.954 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:22.954 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:22.954 00:07:22.954 real 0m0.039s 00:07:22.954 user 0m0.021s 00:07:22.954 sys 0m0.017s 00:07:22.954 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:22.954 13:33:11 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:07:22.954 ************************************ 00:07:22.954 END TEST accel_wrong_workload 00:07:22.954 ************************************ 00:07:22.954 Error: writing output failed: Broken pipe 00:07:22.954 13:33:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:22.954 13:33:11 accel -- accel/accel.sh@111 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:22.954 13:33:11 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:22.954 13:33:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:22.954 13:33:11 accel -- common/autotest_common.sh@10 -- # set +x 00:07:22.954 ************************************ 00:07:22.954 START TEST accel_negative_buffers 00:07:22.954 ************************************ 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type 
-t "$arg")" in 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:07:22.954 13:33:11 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:07:22.954 13:33:11 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:07:22.954 13:33:11 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:22.954 13:33:11 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:22.954 13:33:11 accel.accel_negative_buffers -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:22.954 13:33:11 accel.accel_negative_buffers -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:22.954 13:33:11 accel.accel_negative_buffers -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:22.954 13:33:11 accel.accel_negative_buffers -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:22.954 13:33:11 accel.accel_negative_buffers -- accel/accel.sh@49 -- # local IFS=, 00:07:22.954 13:33:11 accel.accel_negative_buffers -- accel/accel.sh@50 -- # jq -r . 00:07:22.954 -x option must be non-negative. 00:07:22.954 [2024-07-12 13:33:11.425169] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:22.954 accel_perf options: 00:07:22.954 [-h help message] 00:07:22.954 [-q queue depth per core] 00:07:22.954 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:22.954 [-T number of threads per core 00:07:22.954 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:22.954 [-t time in seconds] 00:07:22.954 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:22.954 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:22.954 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:22.954 [-l for compress/decompress workloads, name of uncompressed input file 00:07:22.954 [-S for crc32c workload, use this seed value (default 0) 00:07:22.954 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:22.954 [-f for fill workload, use this BYTE value (default 255) 00:07:22.954 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:22.954 [-y verify result if this switch is on] 00:07:22.954 [-a tasks to allocate per core (default: same value as -q)] 00:07:22.954 Can be used to spread operations across a wider range of memory. 
00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:22.954 00:07:22.954 real 0m0.045s 00:07:22.954 user 0m0.023s 00:07:22.954 sys 0m0.022s 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:22.954 13:33:11 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:07:22.954 ************************************ 00:07:22.954 END TEST accel_negative_buffers 00:07:22.954 ************************************ 00:07:22.954 Error: writing output failed: Broken pipe 00:07:22.954 13:33:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:22.954 13:33:11 accel -- accel/accel.sh@115 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:22.954 13:33:11 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:22.954 13:33:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:22.954 13:33:11 accel -- common/autotest_common.sh@10 -- # set +x 00:07:22.954 ************************************ 00:07:22.954 START TEST accel_crc32c 00:07:22.954 ************************************ 00:07:22.954 13:33:11 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@49 -- # local IFS=, 00:07:22.954 13:33:11 accel.accel_crc32c -- accel/accel.sh@50 -- # jq -r . 00:07:23.213 [2024-07-12 13:33:11.558192] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:07:23.213 [2024-07-12 13:33:11.558320] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid399120 ] 00:07:23.213 [2024-07-12 13:33:11.752955] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.473 [2024-07-12 13:33:11.859169] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # 
case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:23.473 13:33:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:24.851 13:33:13 
accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:24.851 13:33:13 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:24.851 00:07:24.851 real 0m1.597s 00:07:24.851 user 0m1.334s 00:07:24.851 sys 0m0.264s 00:07:24.851 13:33:13 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:24.851 13:33:13 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:24.851 ************************************ 00:07:24.851 END TEST accel_crc32c 00:07:24.851 ************************************ 00:07:24.851 13:33:13 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:24.851 13:33:13 accel -- accel/accel.sh@116 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:24.851 13:33:13 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:24.851 13:33:13 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:24.851 13:33:13 accel -- common/autotest_common.sh@10 -- # set +x 00:07:24.851 ************************************ 00:07:24.851 START TEST accel_crc32c_C2 00:07:24.851 ************************************ 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@49 -- # local IFS=, 00:07:24.851 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@50 -- # jq -r . 
00:07:24.851 [2024-07-12 13:33:13.223498] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:07:24.851 [2024-07-12 13:33:13.223559] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid399337 ] 00:07:24.851 [2024-07-12 13:33:13.353500] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.109 [2024-07-12 13:33:13.454965] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.109 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.109 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.109 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.109 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.109 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:25.110 13:33:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- 
accel/accel.sh@21 -- # case "$var" in 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:26.487 00:07:26.487 real 0m1.507s 00:07:26.487 user 0m1.312s 00:07:26.487 sys 0m0.200s 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:26.487 13:33:14 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:26.487 ************************************ 00:07:26.487 END TEST accel_crc32c_C2 00:07:26.487 ************************************ 00:07:26.487 13:33:14 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:26.487 13:33:14 accel -- accel/accel.sh@117 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:26.487 13:33:14 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:26.487 13:33:14 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:26.487 13:33:14 accel -- common/autotest_common.sh@10 -- # set +x 00:07:26.487 ************************************ 00:07:26.487 START TEST accel_copy 00:07:26.487 ************************************ 00:07:26.487 13:33:14 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:26.487 13:33:14 accel.accel_copy -- 
accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@49 -- # local IFS=, 00:07:26.487 13:33:14 accel.accel_copy -- accel/accel.sh@50 -- # jq -r . 00:07:26.487 [2024-07-12 13:33:14.809264] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:07:26.487 [2024-07-12 13:33:14.809326] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid399528 ] 00:07:26.487 [2024-07-12 13:33:14.938214] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.487 [2024-07-12 13:33:15.038404] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # 
case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:26.746 13:33:15 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:27.683 13:33:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:27.683 13:33:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:27.683 13:33:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:27.683 13:33:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:27.683 13:33:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:27.683 13:33:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:27.683 13:33:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:27.942 13:33:16 accel.accel_copy 
-- accel/accel.sh@19 -- # read -r var val 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:27.942 13:33:16 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:27.942 00:07:27.942 real 0m1.497s 00:07:27.942 user 0m1.313s 00:07:27.942 sys 0m0.190s 00:07:27.942 13:33:16 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:27.942 13:33:16 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:07:27.942 ************************************ 00:07:27.942 END TEST accel_copy 00:07:27.942 ************************************ 00:07:27.942 13:33:16 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:27.942 13:33:16 accel -- accel/accel.sh@118 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:27.942 13:33:16 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:27.942 13:33:16 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:27.942 13:33:16 accel -- common/autotest_common.sh@10 -- # set +x 00:07:27.942 ************************************ 00:07:27.942 START TEST accel_fill 00:07:27.942 ************************************ 00:07:27.942 13:33:16 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@49 -- # local IFS=, 00:07:27.942 13:33:16 accel.accel_fill -- accel/accel.sh@50 -- # jq -r . 00:07:27.942 [2024-07-12 13:33:16.391859] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
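[editor's note] The fill case just launched uses the accel_perf command line visible in the trace (-t 1 -w fill -f 128 -q 64 -a 64 -y). A rough way to re-run it outside the harness is sketched below; it assumes SPDK_DIR points at an already-built checkout, drops the -c /dev/fd/62 JSON config that build_accel_config normally feeds in (so accel_perf falls back to its defaults), and uses sudo because the DPDK EAL hugepage setup seen in this log generally needs root.

    # Hypothetical standalone re-run of the fill workload traced above.
    SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # assumed build location
    sudo "$SPDK_DIR/build/examples/accel_perf" -t 1 -w fill -f 128 -q 64 -a 64 -y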
00:07:27.942 [2024-07-12 13:33:16.391997] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid399734 ] 00:07:28.202 [2024-07-12 13:33:16.585918] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.202 [2024-07-12 13:33:16.689703] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@22 -- # 
accel_module=software 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:28.202 13:33:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:29.580 13:33:17 accel.accel_fill 
-- accel/accel.sh@20 -- # val= 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:29.580 13:33:17 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:29.580 00:07:29.580 real 0m1.583s 00:07:29.580 user 0m1.332s 00:07:29.580 sys 0m0.248s 00:07:29.580 13:33:17 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:29.580 13:33:17 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:29.580 ************************************ 00:07:29.580 END TEST accel_fill 00:07:29.580 ************************************ 00:07:29.580 13:33:17 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:29.580 13:33:17 accel -- accel/accel.sh@119 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:29.580 13:33:17 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:29.580 13:33:17 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:29.580 13:33:17 accel -- common/autotest_common.sh@10 -- # set +x 00:07:29.580 ************************************ 00:07:29.580 START TEST accel_copy_crc32c 00:07:29.580 ************************************ 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@49 -- # local IFS=, 00:07:29.580 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@50 -- # jq -r . 00:07:29.580 [2024-07-12 13:33:18.048723] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:07:29.580 [2024-07-12 13:33:18.048790] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid399941 ] 00:07:29.840 [2024-07-12 13:33:18.178422] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.840 [2024-07-12 13:33:18.279253] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:29.840 13:33:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@20 
-- # val= 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:31.217 00:07:31.217 real 0m1.517s 00:07:31.217 user 0m1.320s 00:07:31.217 sys 0m0.196s 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:31.217 13:33:19 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:31.217 ************************************ 00:07:31.217 END TEST accel_copy_crc32c 00:07:31.217 ************************************ 00:07:31.217 13:33:19 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:31.217 13:33:19 accel -- accel/accel.sh@120 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:31.217 13:33:19 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:31.217 13:33:19 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.217 13:33:19 accel -- common/autotest_common.sh@10 -- # set +x 00:07:31.217 ************************************ 00:07:31.217 START TEST accel_copy_crc32c_C2 00:07:31.217 ************************************ 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@49 -- # local IFS=, 00:07:31.217 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@50 -- # jq -r . 00:07:31.217 [2024-07-12 13:33:19.640502] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:07:31.217 [2024-07-12 13:33:19.640577] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid400247 ] 00:07:31.217 [2024-07-12 13:33:19.784618] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.476 [2024-07-12 13:33:19.890208] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 
00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # 
case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:31.476 13:33:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:32.856 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:32.856 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:32.856 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:32.857 00:07:32.857 real 0m1.524s 00:07:32.857 user 0m1.323s 00:07:32.857 sys 0m0.207s 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:32.857 13:33:21 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:32.857 
************************************ 00:07:32.857 END TEST accel_copy_crc32c_C2 00:07:32.857 ************************************ 00:07:32.857 13:33:21 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:32.857 13:33:21 accel -- accel/accel.sh@121 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:32.857 13:33:21 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:32.857 13:33:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:32.857 13:33:21 accel -- common/autotest_common.sh@10 -- # set +x 00:07:32.857 ************************************ 00:07:32.857 START TEST accel_dualcast 00:07:32.857 ************************************ 00:07:32.857 13:33:21 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@49 -- # local IFS=, 00:07:32.857 13:33:21 accel.accel_dualcast -- accel/accel.sh@50 -- # jq -r . 00:07:32.857 [2024-07-12 13:33:21.254954] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:07:32.857 [2024-07-12 13:33:21.255019] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid400482 ] 00:07:32.857 [2024-07-12 13:33:21.387907] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.115 [2024-07-12 13:33:21.493104] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:33.115 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- 
accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:33.116 13:33:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:34.491 13:33:22 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:34.491 00:07:34.491 real 0m1.513s 00:07:34.491 user 0m1.328s 00:07:34.491 sys 0m0.189s 00:07:34.491 13:33:22 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:34.491 13:33:22 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:34.491 ************************************ 00:07:34.491 END TEST accel_dualcast 00:07:34.491 ************************************ 00:07:34.491 13:33:22 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:34.491 13:33:22 accel -- accel/accel.sh@122 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:34.491 13:33:22 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:34.491 13:33:22 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:34.491 13:33:22 accel -- common/autotest_common.sh@10 -- # set +x 00:07:34.491 ************************************ 00:07:34.491 START TEST accel_compare 00:07:34.491 ************************************ 00:07:34.491 13:33:22 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@49 -- # local IFS=, 00:07:34.491 13:33:22 accel.accel_compare -- accel/accel.sh@50 -- # jq -r . 00:07:34.491 [2024-07-12 13:33:22.862137] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
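[editor's note] Taken together, this stretch of the log is accel.sh sweeping one workload at a time through the same one-second accel_perf run: copy, fill, copy_crc32c, copy_crc32c with -C 2, dualcast, and now compare. A condensed view of that sweep, using the run_test invocations exactly as they appear in the trace (accel_test is the helper defined by accel.sh and is not reproduced here):

    # Condensed view of the sweep driven by accel.sh in this section of the log.
    run_test accel_copy            accel_test -t 1 -w copy -y
    run_test accel_fill            accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y
    run_test accel_copy_crc32c     accel_test -t 1 -w copy_crc32c -y
    run_test accel_copy_crc32c_C2  accel_test -t 1 -w copy_crc32c -y -C 2
    run_test accel_dualcast        accel_test -t 1 -w dualcast -y
    run_test accel_compare         accel_test -t 1 -w compare -y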
00:07:34.492 [2024-07-12 13:33:22.862264] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid400682 ] 00:07:34.492 [2024-07-12 13:33:23.056215] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.750 [2024-07-12 13:33:23.161781] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 
13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:34.750 13:33:23 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:36.125 13:33:24 
accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:36.125 13:33:24 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:36.125 00:07:36.125 real 0m1.593s 00:07:36.125 user 0m1.328s 00:07:36.125 sys 0m0.268s 00:07:36.125 13:33:24 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:36.125 13:33:24 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:36.125 ************************************ 00:07:36.125 END TEST accel_compare 00:07:36.125 ************************************ 00:07:36.125 13:33:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:36.125 13:33:24 accel -- accel/accel.sh@123 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:36.125 13:33:24 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:36.125 13:33:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.125 13:33:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:36.125 ************************************ 00:07:36.125 START TEST accel_xor 00:07:36.125 ************************************ 00:07:36.125 13:33:24 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@49 -- # local IFS=, 00:07:36.125 13:33:24 accel.accel_xor -- accel/accel.sh@50 -- # jq -r . 00:07:36.125 [2024-07-12 13:33:24.528640] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
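Each case in this suite is launched through the same wrapper pattern visible in the trace: run_test (defined in common/autotest_common.sh, whose @1123 frame invokes the test body) names the xtrace scope that prefixes every line, and accel_test (a helper in accel/accel.sh, per the accel/accel.sh@15-@17 frames) forwards its arguments to accel_perf. The cases exercised so far follow this form:

  run_test accel_compare accel_test -t 1 -w compare -y
  run_test accel_xor     accel_test -t 1 -w xor -y

The real/user/sys summary printed at the end of each case appears to be the shell's time report for the wrapped body, so each software-module run above finishes in roughly 1.5 seconds of wall clock for its 1-second workload.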
00:07:36.125 [2024-07-12 13:33:24.528705] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid400879 ] 00:07:36.125 [2024-07-12 13:33:24.641349] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.385 [2024-07-12 13:33:24.746649] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:36.385 13:33:24 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:36.385 13:33:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:37.763 13:33:25 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:37.763 00:07:37.763 real 0m1.492s 00:07:37.763 user 0m1.314s 00:07:37.763 sys 0m0.185s 00:07:37.763 13:33:25 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:37.763 13:33:25 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:37.763 ************************************ 00:07:37.763 END TEST accel_xor 00:07:37.763 ************************************ 00:07:37.763 13:33:26 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:37.763 13:33:26 accel -- accel/accel.sh@124 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:37.763 13:33:26 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:37.763 13:33:26 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:37.763 13:33:26 accel -- common/autotest_common.sh@10 -- # set +x 00:07:37.763 ************************************ 00:07:37.763 START TEST accel_xor 00:07:37.763 ************************************ 00:07:37.763 13:33:26 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@49 -- # local IFS=, 00:07:37.763 13:33:26 accel.accel_xor -- accel/accel.sh@50 -- # jq -r . 00:07:37.763 [2024-07-12 13:33:26.102507] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
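The second accel_xor case adds -x 3 to the same workload. Judging from the traced values (the first xor run reads back 2 where this one reads back 3), -x appears to set the number of xor source buffers, so this pass exercises a three-source xor instead of the default two. A hedged reproduction sketch under the same assumptions as above:

  ./build/examples/accel_perf -t 1 -w xor -y -x 3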
00:07:37.763 [2024-07-12 13:33:26.102573] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid401081 ] 00:07:37.763 [2024-07-12 13:33:26.235347] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.763 [2024-07-12 13:33:26.340251] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:38.023 13:33:26 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:38.023 13:33:26 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:39.399 13:33:27 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:39.399 00:07:39.399 real 0m1.519s 00:07:39.399 user 0m1.321s 00:07:39.399 sys 0m0.201s 00:07:39.399 13:33:27 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:39.399 13:33:27 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:39.399 ************************************ 00:07:39.399 END TEST accel_xor 00:07:39.399 ************************************ 00:07:39.399 13:33:27 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:39.399 13:33:27 accel -- accel/accel.sh@125 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:39.399 13:33:27 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:39.399 13:33:27 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:39.399 13:33:27 accel -- common/autotest_common.sh@10 -- # set +x 00:07:39.399 ************************************ 00:07:39.399 START TEST accel_dif_verify 00:07:39.399 ************************************ 00:07:39.399 13:33:27 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@49 -- # local IFS=, 00:07:39.399 13:33:27 accel.accel_dif_verify -- accel/accel.sh@50 -- # jq -r . 00:07:39.399 [2024-07-12 13:33:27.697664] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
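With several cases now complete, the per-test timing summaries (real/user/sys) and the START/END banners are the quickest way to scan a log like this for slow or missing cases. A small sketch, assuming the console output has been saved to a hypothetical file named crypto-phy-autotest.log:

  grep -oE '(START|END) TEST [a-z_]+|real[[:space:]]+[0-9]+m[0-9.]+s' crypto-phy-autotest.log

This prints each test's START banner, its wall-clock time, and its END banner in order, e.g. 'START TEST accel_xor', 'real 0m1.519s', 'END TEST accel_xor'.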
00:07:39.399 [2024-07-12 13:33:27.697725] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid401337 ] 00:07:39.399 [2024-07-12 13:33:27.828462] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.399 [2024-07-12 13:33:27.928846] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.662 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:39.662 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.662 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.662 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.662 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:39.662 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.662 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.662 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.662 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:07:39.662 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:39.663 13:33:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:40.598 13:33:29 accel.accel_dif_verify -- 
accel/accel.sh@21 -- # case "$var" in 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:40.598 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:07:40.855 13:33:29 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:40.855 00:07:40.855 real 0m1.522s 00:07:40.855 user 0m1.322s 00:07:40.855 sys 0m0.197s 00:07:40.855 13:33:29 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:40.855 13:33:29 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:07:40.855 ************************************ 00:07:40.855 END TEST accel_dif_verify 00:07:40.855 ************************************ 00:07:40.855 13:33:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:40.855 13:33:29 accel -- accel/accel.sh@126 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:40.855 13:33:29 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:40.855 13:33:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:40.856 13:33:29 accel -- common/autotest_common.sh@10 -- # set +x 00:07:40.856 ************************************ 00:07:40.856 START TEST accel_dif_generate 00:07:40.856 ************************************ 00:07:40.856 13:33:29 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 
-- # read -r var val 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@49 -- # local IFS=, 00:07:40.856 13:33:29 accel.accel_dif_generate -- accel/accel.sh@50 -- # jq -r . 00:07:40.856 [2024-07-12 13:33:29.300229] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:07:40.856 [2024-07-12 13:33:29.300290] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid401632 ] 00:07:40.856 [2024-07-12 13:33:29.429895] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.113 [2024-07-12 13:33:29.530948] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:07:41.113 13:33:29 accel.accel_dif_generate -- 
accel/accel.sh@21 -- # case "$var" in 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.113 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 
00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:41.114 13:33:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:07:42.487 13:33:30 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e 
]] 00:07:42.487 00:07:42.487 real 0m1.515s 00:07:42.487 user 0m1.328s 00:07:42.487 sys 0m0.193s 00:07:42.487 13:33:30 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:42.487 13:33:30 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:07:42.487 ************************************ 00:07:42.487 END TEST accel_dif_generate 00:07:42.487 ************************************ 00:07:42.487 13:33:30 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:42.487 13:33:30 accel -- accel/accel.sh@127 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:42.487 13:33:30 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:42.487 13:33:30 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:42.488 13:33:30 accel -- common/autotest_common.sh@10 -- # set +x 00:07:42.488 ************************************ 00:07:42.488 START TEST accel_dif_generate_copy 00:07:42.488 ************************************ 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@49 -- # local IFS=, 00:07:42.488 13:33:30 accel.accel_dif_generate_copy -- accel/accel.sh@50 -- # jq -r . 00:07:42.488 [2024-07-12 13:33:30.901179] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
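The dif_generate_copy case traced below follows the same shape as dif_verify and dif_generate before it: a one-second software-module run over the 4096-byte buffers read back in the trace, again launched through accel_perf with only the workload name changed. A minimal sketch under the same assumptions as the earlier examples:

  ./build/examples/accel_perf -t 1 -w dif_generate_copy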
00:07:42.488 [2024-07-12 13:33:30.901244] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid401823 ] 00:07:42.488 [2024-07-12 13:33:31.032097] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.746 [2024-07-12 13:33:31.137513] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.746 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:42.746 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.746 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.746 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- 
accel/accel.sh@20 -- # val= 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:42.747 13:33:31 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.124 13:33:32 
accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:44.124 00:07:44.124 real 0m1.522s 00:07:44.124 user 0m1.322s 00:07:44.124 sys 0m0.200s 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.124 13:33:32 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:07:44.124 ************************************ 00:07:44.124 END TEST accel_dif_generate_copy 00:07:44.124 ************************************ 00:07:44.124 13:33:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:44.124 13:33:32 accel -- accel/accel.sh@129 -- # [[ y == y ]] 00:07:44.124 13:33:32 accel -- accel/accel.sh@130 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:44.124 13:33:32 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:44.124 13:33:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:44.124 13:33:32 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.124 ************************************ 00:07:44.124 START TEST accel_comp 00:07:44.124 ************************************ 00:07:44.124 13:33:32 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:44.124 13:33:32 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:44.124 
13:33:32 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:07:44.124 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.124 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.125 13:33:32 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:44.125 13:33:32 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:44.125 13:33:32 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:44.125 13:33:32 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.125 13:33:32 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.125 13:33:32 accel.accel_comp -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:44.125 13:33:32 accel.accel_comp -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:44.125 13:33:32 accel.accel_comp -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:44.125 13:33:32 accel.accel_comp -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:44.125 13:33:32 accel.accel_comp -- accel/accel.sh@49 -- # local IFS=, 00:07:44.125 13:33:32 accel.accel_comp -- accel/accel.sh@50 -- # jq -r . 00:07:44.125 [2024-07-12 13:33:32.506681] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:07:44.125 [2024-07-12 13:33:32.506746] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid402025 ] 00:07:44.125 [2024-07-12 13:33:32.639316] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.384 [2024-07-12 13:33:32.746369] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 
00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 
accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:44.384 13:33:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:45.761 13:33:33 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:45.761 00:07:45.761 real 0m1.522s 00:07:45.761 user 0m1.329s 00:07:45.761 sys 0m0.201s 00:07:45.761 13:33:33 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:45.761 13:33:33 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:07:45.761 ************************************ 00:07:45.761 END TEST accel_comp 00:07:45.761 ************************************ 00:07:45.761 13:33:34 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:45.761 13:33:34 accel -- accel/accel.sh@131 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:45.761 13:33:34 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:45.761 13:33:34 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:45.761 13:33:34 accel -- common/autotest_common.sh@10 -- # set +x 
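The accel_comp test above exercises -w compress against the bundled test input (-l .../spdk/test/accel/bib) on the software module. A minimal standalone sketch, assuming the same checkout and again omitting the harness-generated -c config:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # compress the bib test file for 1 second, as in the run above
  "$SPDK/build/examples/accel_perf" -t 1 -w compress -l "$SPDK/test/accel/bib"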
00:07:45.761 ************************************ 00:07:45.761 START TEST accel_decomp 00:07:45.761 ************************************ 00:07:45.761 13:33:34 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@49 -- # local IFS=, 00:07:45.761 13:33:34 accel.accel_decomp -- accel/accel.sh@50 -- # jq -r . 00:07:45.761 [2024-07-12 13:33:34.110498] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
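accel_decomp reverses the direction: -w decompress over the same bib file, plus the -y switch the harness passes (read here as accel_perf's result-verification option). A hedged equivalent of the command line recorded above:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # decompress and verify; flags mirror the accel_perf invocation in this log
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK/test/accel/bib" -y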
00:07:45.761 [2024-07-12 13:33:34.110563] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid402218 ] 00:07:45.761 [2024-07-12 13:33:34.243869] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.020 [2024-07-12 13:33:34.349090] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 
-- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:46.020 13:33:34 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:47.398 13:33:35 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:47.398 13:33:35 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:47.398 00:07:47.398 real 0m1.524s 00:07:47.398 user 0m1.320s 00:07:47.398 sys 0m0.212s 00:07:47.398 13:33:35 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:47.398 13:33:35 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:47.398 ************************************ 00:07:47.398 END TEST accel_decomp 00:07:47.398 ************************************ 00:07:47.398 13:33:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:47.398 13:33:35 accel -- accel/accel.sh@132 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:47.398 13:33:35 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:47.398 13:33:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.398 13:33:35 accel -- common/autotest_common.sh@10 -- # set +x 00:07:47.398 ************************************ 00:07:47.398 START TEST accel_decomp_full 00:07:47.398 ************************************ 00:07:47.398 13:33:35 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:47.398 13:33:35 
accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@49 -- # local IFS=, 00:07:47.398 13:33:35 accel.accel_decomp_full -- accel/accel.sh@50 -- # jq -r . 00:07:47.398 [2024-07-12 13:33:35.722288] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:07:47.398 [2024-07-12 13:33:35.722357] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid402422 ] 00:07:47.398 [2024-07-12 13:33:35.854268] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.398 [2024-07-12 13:33:35.962056] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 
00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.657 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.658 13:33:36 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:47.658 13:33:36 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:49.033 13:33:37 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:49.033 00:07:49.033 real 0m1.541s 00:07:49.033 user 0m1.351s 00:07:49.033 sys 0m0.193s 00:07:49.033 13:33:37 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:49.033 13:33:37 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:07:49.033 ************************************ 00:07:49.033 END TEST accel_decomp_full 00:07:49.033 ************************************ 00:07:49.033 13:33:37 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:49.033 13:33:37 accel -- accel/accel.sh@133 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:49.033 13:33:37 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:49.033 13:33:37 accel -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:07:49.033 13:33:37 accel -- common/autotest_common.sh@10 -- # set +x 00:07:49.033 ************************************ 00:07:49.033 START TEST accel_decomp_mcore 00:07:49.033 ************************************ 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@49 -- # local IFS=, 00:07:49.033 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@50 -- # jq -r . 00:07:49.033 [2024-07-12 13:33:37.340022] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
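accel_decomp_mcore runs the same decompress workload with a -m 0xf core mask; the EAL parameters and reactor notices that follow confirm four cores (0-3) come up instead of the single core used by the earlier tests. Sketch, under the same assumptions as the earlier ones:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # spread the decompress workload across a 4-core mask
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -m 0xf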
00:07:49.033 [2024-07-12 13:33:37.340084] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid402734 ] 00:07:49.033 [2024-07-12 13:33:37.470365] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:49.033 [2024-07-12 13:33:37.574742] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:49.033 [2024-07-12 13:33:37.574840] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:49.033 [2024-07-12 13:33:37.574967] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:49.033 [2024-07-12 13:33:37.574969] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 13:33:37 accel.accel_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:50.668 00:07:50.668 real 0m1.515s 00:07:50.668 user 0m4.738s 00:07:50.668 sys 0m0.208s 00:07:50.668 13:33:38 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:50.668 13:33:38 
accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:50.668 ************************************ 00:07:50.668 END TEST accel_decomp_mcore 00:07:50.668 ************************************ 00:07:50.668 13:33:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:50.668 13:33:38 accel -- accel/accel.sh@134 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:50.668 13:33:38 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:50.668 13:33:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:50.668 13:33:38 accel -- common/autotest_common.sh@10 -- # set +x 00:07:50.668 ************************************ 00:07:50.668 START TEST accel_decomp_full_mcore 00:07:50.668 ************************************ 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@49 -- # local IFS=, 00:07:50.668 13:33:38 accel.accel_decomp_full_mcore -- accel/accel.sh@50 -- # jq -r . 00:07:50.668 [2024-07-12 13:33:38.944616] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
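accel_decomp_full_mcore combines both knobs: the 0xf core mask and -o 0, which, judging by the '111250 bytes' buffers logged for the _full variants versus the default '4096 bytes', makes each operation span the whole input file. Sketch under that assumption:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # full-buffer decompress on four cores; the meaning of -o 0 is inferred from this log, not asserted
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -o 0 -m 0xf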
00:07:50.668 [2024-07-12 13:33:38.944677] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid402972 ] 00:07:50.668 [2024-07-12 13:33:39.074510] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:50.668 [2024-07-12 13:33:39.179703] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:50.668 [2024-07-12 13:33:39.179803] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:50.668 [2024-07-12 13:33:39.179903] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:50.668 [2024-07-12 13:33:39.179904] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.668 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:50.926 13:33:39 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.926 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.927 13:33:39 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:50.927 13:33:39 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.303 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.303 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:52.304 13:33:40 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:52.304 00:07:52.304 real 0m1.550s 00:07:52.304 user 0m4.871s 00:07:52.304 sys 0m0.209s 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:52.304 13:33:40 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:52.304 ************************************ 00:07:52.304 END TEST accel_decomp_full_mcore 00:07:52.304 ************************************ 00:07:52.304 13:33:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:52.304 13:33:40 accel -- accel/accel.sh@135 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:52.304 13:33:40 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:07:52.304 13:33:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:52.304 13:33:40 accel -- common/autotest_common.sh@10 -- # set +x 00:07:52.304 ************************************ 00:07:52.304 START TEST accel_decomp_mthread 00:07:52.304 ************************************ 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@49 -- # local IFS=, 00:07:52.304 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@50 -- # jq -r . 00:07:52.304 [2024-07-12 13:33:40.567368] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
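
The block that just ended reports real 0m1.550s against user 0m4.871s, i.e. roughly three cores' worth of CPU burned in parallel, consistent with the 0xf mask and the four reactors it started. The accel_decomp_mthread case beginning here goes the other way: a single core (0x1 mask) with -T 2, which I read as two worker threads on that core; the log itself only shows the flag and the '2' echoed back a few records later. As a rough standalone command (again omitting the harness-generated -c config):

    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w decompress \
        -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib \
        -y -T 2
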
00:07:52.304 [2024-07-12 13:33:40.567443] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid403175 ] 00:07:52.304 [2024-07-12 13:33:40.710486] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.304 [2024-07-12 13:33:40.814780] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 
13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:52.585 13:33:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:53.519 00:07:53.519 real 0m1.530s 00:07:53.519 user 0m1.327s 00:07:53.519 sys 0m0.211s 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:53.519 13:33:42 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:53.519 ************************************ 00:07:53.519 END TEST accel_decomp_mthread 00:07:53.519 ************************************ 00:07:53.778 13:33:42 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:53.778 13:33:42 accel -- accel/accel.sh@136 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:53.778 13:33:42 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:53.778 13:33:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:53.778 13:33:42 accel -- common/autotest_common.sh@10 -- # set +x 00:07:53.778 ************************************ 
00:07:53.778 START TEST accel_decomp_full_mthread 00:07:53.778 ************************************ 00:07:53.778 13:33:42 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:53.778 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:53.778 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:53.778 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:53.778 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:53.778 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:53.778 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:53.778 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:53.779 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:53.779 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:53.779 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:53.779 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:53.779 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:53.779 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@45 -- # [[ -n '' ]] 00:07:53.779 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@49 -- # local IFS=, 00:07:53.779 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@50 -- # jq -r . 00:07:53.779 [2024-07-12 13:33:42.187346] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:07:53.779 [2024-07-12 13:33:42.187412] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid403369 ] 00:07:53.779 [2024-07-12 13:33:42.318469] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.038 [2024-07-12 13:33:42.424122] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 
13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:54.038 13:33:42 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:54.038 13:33:42 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:55.411 00:07:55.411 real 0m1.546s 00:07:55.411 user 0m1.348s 00:07:55.411 sys 0m0.205s 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:55.411 13:33:43 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:55.411 ************************************ 00:07:55.411 END TEST accel_decomp_full_mthread 00:07:55.411 ************************************ 
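
Stepping back: the three software-path decompress cases in this stretch (full_mcore, mthread, full_mthread) use the same accel_perf invocation and differ only in the flag sets below. A compact way to re-run them by hand, without run_test's timing and START/END banners and without the generated -c config, might look like this sketch:

    PERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf
    BIB=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
    for extra in "-o 0 -m 0xf" "-T 2" "-o 0 -T 2"; do
        # word-splitting of $extra into separate flags is intended here
        $PERF -t 1 -w decompress -l "$BIB" -y $extra
    done
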
00:07:55.411 13:33:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:55.411 13:33:43 accel -- accel/accel.sh@138 -- # [[ y == y ]] 00:07:55.411 13:33:43 accel -- accel/accel.sh@139 -- # COMPRESSDEV=1 00:07:55.411 13:33:43 accel -- accel/accel.sh@140 -- # get_expected_opcs 00:07:55.411 13:33:43 accel -- accel/accel.sh@69 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:55.411 13:33:43 accel -- accel/accel.sh@71 -- # spdk_tgt_pid=403569 00:07:55.411 13:33:43 accel -- accel/accel.sh@72 -- # waitforlisten 403569 00:07:55.411 13:33:43 accel -- accel/accel.sh@70 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:55.411 13:33:43 accel -- common/autotest_common.sh@829 -- # '[' -z 403569 ']' 00:07:55.411 13:33:43 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:55.411 13:33:43 accel -- accel/accel.sh@70 -- # build_accel_config 00:07:55.411 13:33:43 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:55.411 13:33:43 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:55.411 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:55.411 13:33:43 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:55.411 13:33:43 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:55.411 13:33:43 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:55.411 13:33:43 accel -- common/autotest_common.sh@10 -- # set +x 00:07:55.411 13:33:43 accel -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:55.411 13:33:43 accel -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:55.411 13:33:43 accel -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:55.411 13:33:43 accel -- accel/accel.sh@45 -- # [[ -n 1 ]] 00:07:55.411 13:33:43 accel -- accel/accel.sh@46 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:55.411 13:33:43 accel -- accel/accel.sh@49 -- # local IFS=, 00:07:55.411 13:33:43 accel -- accel/accel.sh@50 -- # jq -r . 00:07:55.411 [2024-07-12 13:33:43.813789] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
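
What the next stretch of output is doing: with COMPRESSDEV=1 the harness starts a full spdk_tgt with a generated accel config that enables the DPDK compressdev module, reads the saved config and the opcode-to-module assignments back over RPC (compress and decompress land on dpdk_compressdev, everything else stays on software), then kills the target. A rough standalone equivalent follows; only the compressdev_scan_accel_module fragment is verbatim from the log, the surrounding JSON wrapper is inferred from the jq filter the harness itself uses, and scripts/rpc.py is assumed to be the stock SPDK RPC client talking to the default /var/tmp/spdk.sock.

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    cfg='{"subsystems":[{"subsystem":"accel","config":[{"method":"compressdev_scan_accel_module","params":{"pmd":0}}]}]}'
    $SPDK/build/bin/spdk_tgt -c <(echo "$cfg") &
    tgt_pid=$!
    sleep 2   # the harness uses waitforlisten on /var/tmp/spdk.sock rather than a fixed sleep
    $SPDK/scripts/rpc.py save_config |
        jq -r '.subsystems[] | select(.subsystem=="accel").config[]'
    $SPDK/scripts/rpc.py accel_get_opc_assignments |
        jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
    kill "$tgt_pid"
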
00:07:55.411 [2024-07-12 13:33:43.813874] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid403569 ] 00:07:55.411 [2024-07-12 13:33:43.958274] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.669 [2024-07-12 13:33:44.069516] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.605 [2024-07-12 13:33:44.828699] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:56.605 13:33:45 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:56.605 13:33:45 accel -- common/autotest_common.sh@862 -- # return 0 00:07:56.605 13:33:45 accel -- accel/accel.sh@74 -- # [[ 0 -gt 0 ]] 00:07:56.605 13:33:45 accel -- accel/accel.sh@77 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:56.605 13:33:45 accel -- accel/accel.sh@78 -- # [[ 0 -gt 0 ]] 00:07:56.605 13:33:45 accel -- accel/accel.sh@81 -- # [[ 0 -gt 0 ]] 00:07:56.605 13:33:45 accel -- accel/accel.sh@82 -- # [[ -n 1 ]] 00:07:56.605 13:33:45 accel -- accel/accel.sh@82 -- # check_save_config compressdev_scan_accel_module 00:07:56.605 13:33:45 accel -- accel/accel.sh@65 -- # rpc_cmd save_config 00:07:56.605 13:33:45 accel -- accel/accel.sh@65 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:07:56.605 13:33:45 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:56.605 13:33:45 accel -- common/autotest_common.sh@10 -- # set +x 00:07:56.605 13:33:45 accel -- accel/accel.sh@65 -- # grep compressdev_scan_accel_module 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:56.864 "method": "compressdev_scan_accel_module", 00:07:56.864 13:33:45 accel -- accel/accel.sh@84 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:56.864 13:33:45 accel -- accel/accel.sh@84 -- # rpc_cmd accel_get_opc_assignments 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:56.864 13:33:45 accel -- accel/accel.sh@84 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@10 -- # set +x 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # 
expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@85 -- # for opc_opt in "${exp_opcs[@]}" 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # IFS== 00:07:56.864 13:33:45 accel -- accel/accel.sh@86 -- # read -r opc module 00:07:56.864 13:33:45 accel -- accel/accel.sh@87 -- # expected_opcs["$opc"]=software 00:07:56.864 13:33:45 accel -- accel/accel.sh@89 -- # killprocess 403569 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@948 -- # '[' -z 403569 ']' 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@952 -- # kill -0 403569 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@953 -- # uname 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 403569 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 403569' 00:07:56.864 killing process with pid 403569 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@967 -- # kill 403569 00:07:56.864 13:33:45 accel -- common/autotest_common.sh@972 -- # wait 403569 00:07:57.431 13:33:45 accel -- accel/accel.sh@90 -- # trap - ERR 00:07:57.431 13:33:45 accel -- accel/accel.sh@141 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:57.431 13:33:45 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:57.431 13:33:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.431 13:33:45 accel -- common/autotest_common.sh@10 -- # set +x 00:07:57.431 ************************************ 00:07:57.431 START TEST accel_cdev_comp 00:07:57.431 ************************************ 00:07:57.431 13:33:45 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:57.431 13:33:45 
accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@45 -- # [[ -n 1 ]] 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@46 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@49 -- # local IFS=, 00:07:57.431 13:33:45 accel.accel_cdev_comp -- accel/accel.sh@50 -- # jq -r . 00:07:57.431 [2024-07-12 13:33:45.781389] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:07:57.431 [2024-07-12 13:33:45.781451] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid403926 ] 00:07:57.431 [2024-07-12 13:33:45.910700] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.431 [2024-07-12 13:33:46.010613] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.366 [2024-07-12 13:33:46.783872] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:58.366 [2024-07-12 13:33:46.786477] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2099a00 PMD being used: compress_qat 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 [2024-07-12 13:33:46.790566] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x209e740 PMD being used: compress_qat 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 
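
The two "PMD being used: compress_qat" notices just above are the useful signal in this wall of xtrace: they show the compressdev module actually bound to the QAT PMD rather than quietly running something else. If one were scripting around captures of output like this, a quick assertion could be bolted on (the capture filename here is hypothetical):

    if ! grep -q 'PMD being used: compress_qat' accel_cdev_comp.out; then
        echo 'expected the QAT compressdev PMD to be selected' >&2
        exit 1
    fi
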
00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.366 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:58.367 13:33:46 
accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:58.367 13:33:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:59.743 13:33:47 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:59.743 00:07:59.743 real 0m2.226s 00:07:59.743 user 0m1.640s 00:07:59.743 sys 0m0.583s 00:07:59.743 13:33:47 
accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:59.743 13:33:47 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:07:59.743 ************************************ 00:07:59.743 END TEST accel_cdev_comp 00:07:59.743 ************************************ 00:07:59.743 13:33:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:59.743 13:33:48 accel -- accel/accel.sh@142 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:59.743 13:33:48 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:59.743 13:33:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:59.743 13:33:48 accel -- common/autotest_common.sh@10 -- # set +x 00:07:59.743 ************************************ 00:07:59.743 START TEST accel_cdev_decomp 00:07:59.743 ************************************ 00:07:59.743 13:33:48 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:59.743 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:59.743 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:59.743 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:59.743 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:59.743 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:59.743 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:59.743 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:59.743 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:59.743 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:59.743 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:07:59.744 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:07:59.744 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:07:59.744 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@45 -- # [[ -n 1 ]] 00:07:59.744 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@46 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:59.744 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@49 -- # local IFS=, 00:07:59.744 13:33:48 accel.accel_cdev_decomp -- accel/accel.sh@50 -- # jq -r . 00:07:59.744 [2024-07-12 13:33:48.081176] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
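
The decompress direction under the same compressdev configuration starts here; as a standalone command it is the same invocation as the earlier software accel_decomp case plus the generated config. In the sketch below, only the compressdev_scan_accel_module fragment is taken verbatim from the harness output; the JSON wrapper around it is an assumption.

    cfg='{"subsystems":[{"subsystem":"accel","config":[{"method":"compressdev_scan_accel_module","params":{"pmd":0}}]}]}'
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
        -c <(echo "$cfg") -t 1 -w decompress \
        -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
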
00:07:59.744 [2024-07-12 13:33:48.081236] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid404216 ] 00:07:59.744 [2024-07-12 13:33:48.209998] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.744 [2024-07-12 13:33:48.310030] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.680 [2024-07-12 13:33:49.096735] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:00.680 [2024-07-12 13:33:49.099361] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23a7a00 PMD being used: compress_qat 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.680 [2024-07-12 13:33:49.103532] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23ac740 PMD being used: compress_qat 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:00.680 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.680 13:33:49 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 
00:08:00.681 13:33:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:02.059 00:08:02.059 real 0m2.229s 00:08:02.059 user 0m1.628s 00:08:02.059 sys 0m0.602s 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.059 13:33:50 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:02.059 ************************************ 00:08:02.059 END TEST accel_cdev_decomp 00:08:02.059 ************************************ 00:08:02.059 13:33:50 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:02.059 13:33:50 accel -- accel/accel.sh@143 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:02.059 13:33:50 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:02.059 13:33:50 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.059 13:33:50 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.059 ************************************ 00:08:02.059 START TEST accel_cdev_decomp_full 00:08:02.059 ************************************ 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # 
accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@45 -- # [[ -n 1 ]] 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@46 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@49 -- # local IFS=, 00:08:02.059 13:33:50 accel.accel_cdev_decomp_full -- accel/accel.sh@50 -- # jq -r . 00:08:02.059 [2024-07-12 13:33:50.401360] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:08:02.059 [2024-07-12 13:33:50.401485] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid404497 ] 00:08:02.059 [2024-07-12 13:33:50.598054] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.318 [2024-07-12 13:33:50.705708] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.885 [2024-07-12 13:33:51.466805] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:03.150 [2024-07-12 13:33:51.469394] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10b5a00 PMD being used: compress_qat 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 [2024-07-12 13:33:51.472913] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10b8cd0 PMD being used: compress_qat 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- 
# val='111250 bytes' 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:03.150 13:33:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:04.090 00:08:04.090 real 0m2.288s 00:08:04.090 user 0m1.652s 00:08:04.090 sys 0m0.632s 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:04.090 13:33:52 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:04.090 ************************************ 00:08:04.090 END TEST accel_cdev_decomp_full 00:08:04.090 ************************************ 00:08:04.348 13:33:52 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:04.348 13:33:52 accel -- accel/accel.sh@144 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:04.348 13:33:52 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:04.348 13:33:52 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:04.348 13:33:52 accel -- common/autotest_common.sh@10 -- # set +x 00:08:04.348 ************************************ 00:08:04.348 START TEST accel_cdev_decomp_mcore 00:08:04.348 ************************************ 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:08:04.348 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@45 -- # [[ -n 1 ]] 00:08:04.349 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@46 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:04.349 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@49 -- # local IFS=, 00:08:04.349 13:33:52 accel.accel_cdev_decomp_mcore -- accel/accel.sh@50 -- # jq -r . 00:08:04.349 [2024-07-12 13:33:52.762161] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:08:04.349 [2024-07-12 13:33:52.762230] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid404867 ] 00:08:04.349 [2024-07-12 13:33:52.893887] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:04.608 [2024-07-12 13:33:53.002630] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:04.608 [2024-07-12 13:33:53.002734] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:04.608 [2024-07-12 13:33:53.002838] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:04.608 [2024-07-12 13:33:53.002839] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.544 [2024-07-12 13:33:53.759899] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:05.544 [2024-07-12 13:33:53.762431] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x14c8040 PMD being used: compress_qat 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:05.544 [2024-07-12 13:33:53.768181] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1e2419b8b0 PMD being used: compress_qat 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 [2024-07-12 13:33:53.770173] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x14cd3c0 PMD being used: compress_qat 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.544 [2024-07-12 13:33:53.773960] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1e1c19b8b0 PMD being used: compress_qat 00:08:05.544 [2024-07-12 13:33:53.774221] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1e1419b8b0 PMD being used: compress_qat 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:05.544 13:33:53 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:05.544 13:33:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.480 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.481 13:33:54 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:06.481 00:08:06.481 real 0m2.249s 00:08:06.481 user 0m7.234s 00:08:06.481 sys 0m0.599s 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:06.481 13:33:54 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:06.481 ************************************ 00:08:06.481 END TEST accel_cdev_decomp_mcore 00:08:06.481 ************************************ 00:08:06.481 13:33:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:06.481 13:33:55 accel -- accel/accel.sh@145 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:06.481 13:33:55 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:06.481 13:33:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:06.481 13:33:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:06.481 ************************************ 00:08:06.481 START TEST accel_cdev_decomp_full_mcore 00:08:06.481 ************************************ 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # 
build_accel_config 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@45 -- # [[ -n 1 ]] 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@46 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@49 -- # local IFS=, 00:08:06.481 13:33:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@50 -- # jq -r . 00:08:06.739 [2024-07-12 13:33:55.088500] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:08:06.739 [2024-07-12 13:33:55.088559] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid405236 ] 00:08:06.739 [2024-07-12 13:33:55.217874] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:06.739 [2024-07-12 13:33:55.320588] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:06.739 [2024-07-12 13:33:55.320690] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:06.739 [2024-07-12 13:33:55.320718] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.739 [2024-07-12 13:33:55.320717] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:07.675 [2024-07-12 13:33:56.079132] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:07.675 [2024-07-12 13:33:56.081755] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xb24040 PMD being used: compress_qat 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:07.675 [2024-07-12 13:33:56.086852] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f056819b8b0 PMD being used: compress_qat 00:08:07.675 
13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:07.675 [2024-07-12 13:33:56.088882] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xb240e0 PMD being used: compress_qat 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:07.675 [2024-07-12 13:33:56.092328] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f056019b8b0 PMD being used: compress_qat 00:08:07.675 [2024-07-12 13:33:56.092661] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f055819b8b0 PMD being used: compress_qat 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 
13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.675 13:33:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # 
val= 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:09.054 00:08:09.054 real 0m2.240s 00:08:09.054 user 0m7.236s 00:08:09.054 sys 0m0.602s 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:09.054 13:33:57 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:09.054 ************************************ 00:08:09.054 END TEST accel_cdev_decomp_full_mcore 00:08:09.054 ************************************ 00:08:09.054 13:33:57 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:09.054 13:33:57 accel -- accel/accel.sh@146 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:09.054 13:33:57 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:09.054 13:33:57 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:09.054 13:33:57 accel -- common/autotest_common.sh@10 -- # set +x 00:08:09.054 ************************************ 00:08:09.054 START TEST accel_cdev_decomp_mthread 00:08:09.054 ************************************ 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@45 -- # [[ -n 1 ]] 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@46 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@49 -- # local IFS=, 00:08:09.054 13:33:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@50 -- # jq -r . 00:08:09.054 [2024-07-12 13:33:57.407980] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:08:09.054 [2024-07-12 13:33:57.408041] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid405442 ] 00:08:09.054 [2024-07-12 13:33:57.539246] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.313 [2024-07-12 13:33:57.640014] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.881 [2024-07-12 13:33:58.388498] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:09.881 [2024-07-12 13:33:58.391100] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16c6a00 PMD being used: compress_qat 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.881 [2024-07-12 13:33:58.396100] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16cbb40 PMD being used: compress_qat 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.881 [2024-07-12 13:33:58.398708] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17ee970 PMD being used: compress_qat 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:09.881 13:33:58 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.881 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.882 
13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:09.882 13:33:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 
00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:11.260 00:08:11.260 real 0m2.210s 00:08:11.260 user 0m1.657s 00:08:11.260 sys 0m0.560s 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:11.260 13:33:59 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:11.260 ************************************ 00:08:11.260 END TEST accel_cdev_decomp_mthread 00:08:11.260 ************************************ 00:08:11.260 13:33:59 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:11.260 13:33:59 accel -- accel/accel.sh@147 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.260 13:33:59 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:11.260 13:33:59 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:11.260 13:33:59 accel -- common/autotest_common.sh@10 -- # set +x 00:08:11.260 ************************************ 00:08:11.260 START TEST accel_cdev_decomp_full_mthread 00:08:11.260 ************************************ 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@45 -- # [[ -n 1 ]] 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@46 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@49 -- # local IFS=, 00:08:11.260 13:33:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@50 -- # jq -r . 00:08:11.260 [2024-07-12 13:33:59.703729] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
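The accel_cdev_decomp_mthread case above runs accel_perf in decompress mode against the DPDK compressdev (QAT) accel module, feeding the module config over /dev/fd/62. A minimal standalone sketch of an equivalent run follows; the paths are the ones this workspace uses, and the "subsystems"/"accel" JSON wrapper around the compressdev_scan_accel_module call shown above is an assumption about how build_accel_config packages it.

# Assumed SPDK JSON config wrapping the method visible in the transcript.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
cat > /tmp/accel.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "accel",
      "config": [
        { "method": "compressdev_scan_accel_module", "params": { "pmd": 0 } }
      ]
    }
  ]
}
EOF
# Flags as used above: -t 1 (one-second run), -w decompress, -l <compressed input>,
# -y (verify the result), -T 2 (two worker threads).
$SPDK/build/examples/accel_perf -c /tmp/accel.json \
    -t 1 -w decompress -l $SPDK/test/accel/bib -y -T 2

The accel_cdev_decomp_full_mthread case that starts here is the same invocation plus -o 0, which the harness pairs with the 111250-byte value seen later in the transcript instead of the default 4096 bytes.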
00:08:11.260 [2024-07-12 13:33:59.703854] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid405805 ] 00:08:11.519 [2024-07-12 13:33:59.900406] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.519 [2024-07-12 13:34:00.005503] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.457 [2024-07-12 13:34:00.775163] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:12.457 [2024-07-12 13:34:00.777774] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1dbda00 PMD being used: compress_qat 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.457 [2024-07-12 13:34:00.781945] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1dc0cd0 PMD being used: compress_qat 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 [2024-07-12 13:34:00.784718] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ee95d0 PMD being used: compress_qat 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.457 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.458 13:34:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@20 -- # val= 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:13.395 00:08:13.395 real 0m2.315s 00:08:13.395 user 0m1.686s 00:08:13.395 sys 0m0.626s 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:13.395 13:34:01 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:13.395 ************************************ 00:08:13.395 END TEST accel_cdev_decomp_full_mthread 00:08:13.395 ************************************ 00:08:13.654 13:34:02 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:13.654 13:34:02 accel -- accel/accel.sh@148 -- # unset COMPRESSDEV 00:08:13.654 13:34:02 accel -- accel/accel.sh@150 -- # [[ 0 == 1 ]] 00:08:13.654 13:34:02 accel -- accel/accel.sh@177 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:13.654 13:34:02 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:13.654 13:34:02 accel -- accel/accel.sh@177 -- # build_accel_config 00:08:13.654 13:34:02 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:13.654 13:34:02 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.654 13:34:02 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.654 13:34:02 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.654 13:34:02 accel -- accel/accel.sh@40 -- # [[ '' != \k\e\r\n\e\l ]] 00:08:13.654 13:34:02 accel -- accel/accel.sh@41 -- # [[ 0 -gt 0 ]] 00:08:13.654 13:34:02 accel -- accel/accel.sh@43 -- # [[ 0 -gt 0 ]] 00:08:13.654 13:34:02 accel -- accel/accel.sh@45 -- # [[ -n '' ]] 00:08:13.654 13:34:02 accel -- accel/accel.sh@49 -- # local IFS=, 00:08:13.654 13:34:02 accel -- accel/accel.sh@50 -- # jq -r . 00:08:13.654 ************************************ 00:08:13.654 START TEST accel_dif_functional_tests 00:08:13.654 ************************************ 00:08:13.654 13:34:02 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:13.654 [2024-07-12 13:34:02.118371] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:08:13.654 [2024-07-12 13:34:02.118442] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid406190 ] 00:08:13.913 [2024-07-12 13:34:02.252186] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:13.913 [2024-07-12 13:34:02.359175] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:13.913 [2024-07-12 13:34:02.359276] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:13.913 [2024-07-12 13:34:02.359278] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.913 00:08:13.913 00:08:13.913 CUnit - A unit testing framework for C - Version 2.1-3 00:08:13.913 http://cunit.sourceforge.net/ 00:08:13.913 00:08:13.913 00:08:13.913 Suite: accel_dif 00:08:13.913 Test: verify: DIF generated, GUARD check ...passed 00:08:13.913 Test: verify: DIF generated, APPTAG check ...passed 00:08:13.913 Test: verify: DIF generated, REFTAG check ...passed 00:08:13.913 Test: verify: DIF not generated, GUARD check ...[2024-07-12 13:34:02.455492] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:13.913 passed 00:08:13.913 Test: verify: DIF not generated, APPTAG check ...[2024-07-12 13:34:02.455570] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:13.913 passed 00:08:13.914 Test: verify: DIF not generated, REFTAG check ...[2024-07-12 13:34:02.455607] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:13.914 passed 00:08:13.914 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:13.914 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-12 13:34:02.455685] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:13.914 passed 00:08:13.914 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:13.914 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:13.914 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:13.914 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-12 13:34:02.455848] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:13.914 passed 00:08:13.914 Test: verify copy: DIF generated, GUARD check ...passed 00:08:13.914 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:13.914 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:13.914 Test: verify copy: DIF not generated, GUARD check ...[2024-07-12 13:34:02.456035] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:13.914 passed 00:08:13.914 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-12 13:34:02.456077] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:13.914 passed 00:08:13.914 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-12 13:34:02.456126] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:13.914 passed 00:08:13.914 Test: generate copy: DIF generated, GUARD check ...passed 00:08:13.914 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:13.914 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:13.914 Test: generate copy: DIF generated, no GUARD check flag set ...passed 
00:08:13.914 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:13.914 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:13.914 Test: generate copy: iovecs-len validate ...[2024-07-12 13:34:02.456399] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:08:13.914 passed 00:08:13.914 Test: generate copy: buffer alignment validate ...passed 00:08:13.914 00:08:13.914 Run Summary: Type Total Ran Passed Failed Inactive 00:08:13.914 suites 1 1 n/a 0 0 00:08:13.914 tests 26 26 26 0 0 00:08:13.914 asserts 115 115 115 0 n/a 00:08:13.914 00:08:13.914 Elapsed time = 0.003 seconds 00:08:14.173 00:08:14.173 real 0m0.624s 00:08:14.173 user 0m0.818s 00:08:14.173 sys 0m0.250s 00:08:14.173 13:34:02 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:14.173 13:34:02 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:14.173 ************************************ 00:08:14.173 END TEST accel_dif_functional_tests 00:08:14.173 ************************************ 00:08:14.173 13:34:02 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:14.173 13:34:02 accel -- accel/accel.sh@178 -- # export PCI_ALLOWED= 00:08:14.173 13:34:02 accel -- accel/accel.sh@178 -- # PCI_ALLOWED= 00:08:14.173 00:08:14.173 real 0m54.380s 00:08:14.173 user 1m2.244s 00:08:14.173 sys 0m12.447s 00:08:14.173 13:34:02 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:14.173 13:34:02 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.173 ************************************ 00:08:14.173 END TEST accel 00:08:14.173 ************************************ 00:08:14.431 13:34:02 -- common/autotest_common.sh@1142 -- # return 0 00:08:14.431 13:34:02 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:14.431 13:34:02 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:14.431 13:34:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.431 13:34:02 -- common/autotest_common.sh@10 -- # set +x 00:08:14.431 ************************************ 00:08:14.431 START TEST accel_rpc 00:08:14.431 ************************************ 00:08:14.431 13:34:02 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:14.431 * Looking for test storage... 00:08:14.431 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:14.431 13:34:02 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:14.431 13:34:02 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=406323 00:08:14.431 13:34:02 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 406323 00:08:14.431 13:34:02 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:14.431 13:34:02 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 406323 ']' 00:08:14.431 13:34:02 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:14.431 13:34:02 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:14.431 13:34:02 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:14.431 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
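The DIF functional suite above finishes with 26 of 26 CUnit tests passed; the error lines from dif.c (mismatched Guard, App Tag and Ref Tag values) are the negative cases the suite deliberately provokes. The binary can be driven by hand the same way the harness does, as sketched below; piping an empty "subsystems" config over /dev/fd/62 is an assumption, since the harness feeds whatever build_accel_config emits when no accel modules are configured.

# Hypothetical standalone run of the DIF functional tests exercised above.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/test/accel/dif/dif -c /dev/fd/62 62< <(echo '{"subsystems": []}')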
00:08:14.431 13:34:02 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:14.431 13:34:02 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:14.689 [2024-07-12 13:34:03.046215] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:08:14.689 [2024-07-12 13:34:03.046357] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid406323 ] 00:08:14.689 [2024-07-12 13:34:03.242021] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.947 [2024-07-12 13:34:03.343626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.515 13:34:03 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:15.515 13:34:03 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:15.515 13:34:03 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:15.515 13:34:03 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:15.515 13:34:03 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:15.515 13:34:03 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:15.515 13:34:03 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:15.515 13:34:03 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:15.515 13:34:03 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:15.515 13:34:03 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:15.515 ************************************ 00:08:15.515 START TEST accel_assign_opcode 00:08:15.515 ************************************ 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:15.515 [2024-07-12 13:34:03.953641] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:15.515 [2024-07-12 13:34:03.961653] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:15.515 13:34:03 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:15.774 13:34:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:15.775 13:34:04 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd 
accel_get_opc_assignments 00:08:15.775 13:34:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:15.775 13:34:04 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:15.775 13:34:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:15.775 13:34:04 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:15.775 13:34:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:15.775 software 00:08:15.775 00:08:15.775 real 0m0.279s 00:08:15.775 user 0m0.051s 00:08:15.775 sys 0m0.012s 00:08:15.775 13:34:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:15.775 13:34:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:15.775 ************************************ 00:08:15.775 END TEST accel_assign_opcode 00:08:15.775 ************************************ 00:08:15.775 13:34:04 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:15.775 13:34:04 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 406323 00:08:15.775 13:34:04 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 406323 ']' 00:08:15.775 13:34:04 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 406323 00:08:15.775 13:34:04 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:08:15.775 13:34:04 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:15.775 13:34:04 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 406323 00:08:15.775 13:34:04 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:15.775 13:34:04 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:15.775 13:34:04 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 406323' 00:08:15.775 killing process with pid 406323 00:08:15.775 13:34:04 accel_rpc -- common/autotest_common.sh@967 -- # kill 406323 00:08:15.775 13:34:04 accel_rpc -- common/autotest_common.sh@972 -- # wait 406323 00:08:16.343 00:08:16.343 real 0m1.900s 00:08:16.343 user 0m1.907s 00:08:16.343 sys 0m0.649s 00:08:16.343 13:34:04 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:16.343 13:34:04 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:16.343 ************************************ 00:08:16.343 END TEST accel_rpc 00:08:16.343 ************************************ 00:08:16.343 13:34:04 -- common/autotest_common.sh@1142 -- # return 0 00:08:16.343 13:34:04 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:16.343 13:34:04 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:16.343 13:34:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:16.343 13:34:04 -- common/autotest_common.sh@10 -- # set +x 00:08:16.343 ************************************ 00:08:16.343 START TEST app_cmdline 00:08:16.343 ************************************ 00:08:16.343 13:34:04 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:16.343 * Looking for test storage... 
00:08:16.343 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:16.343 13:34:04 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:16.343 13:34:04 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=406680 00:08:16.343 13:34:04 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:16.343 13:34:04 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 406680 00:08:16.343 13:34:04 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 406680 ']' 00:08:16.343 13:34:04 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:16.343 13:34:04 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:16.343 13:34:04 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:16.343 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:16.343 13:34:04 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:16.343 13:34:04 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:16.602 [2024-07-12 13:34:04.981221] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:08:16.602 [2024-07-12 13:34:04.981295] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid406680 ] 00:08:16.602 [2024-07-12 13:34:05.110488] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.861 [2024-07-12 13:34:05.214729] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.428 13:34:05 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:17.428 13:34:05 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:08:17.428 13:34:05 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:17.687 { 00:08:17.687 "version": "SPDK v24.09-pre git sha1 a49cd26ae", 00:08:17.687 "fields": { 00:08:17.687 "major": 24, 00:08:17.687 "minor": 9, 00:08:17.687 "patch": 0, 00:08:17.687 "suffix": "-pre", 00:08:17.687 "commit": "a49cd26ae" 00:08:17.687 } 00:08:17.687 } 00:08:17.687 13:34:06 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:17.687 13:34:06 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:17.687 13:34:06 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:17.687 13:34:06 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:17.687 13:34:06 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:17.687 13:34:06 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:17.687 13:34:06 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:17.687 13:34:06 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:17.687 13:34:06 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:17.687 13:34:06 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:17.687 13:34:06 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:17.687 13:34:06 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ 
\s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:17.688 13:34:06 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:17.688 13:34:06 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:17.688 13:34:06 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:17.688 13:34:06 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:17.688 13:34:06 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:17.688 13:34:06 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:17.688 13:34:06 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:17.688 13:34:06 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:17.688 13:34:06 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:17.688 13:34:06 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:17.688 13:34:06 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:17.688 13:34:06 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:17.946 request: 00:08:17.946 { 00:08:17.946 "method": "env_dpdk_get_mem_stats", 00:08:17.946 "req_id": 1 00:08:17.946 } 00:08:17.946 Got JSON-RPC error response 00:08:17.946 response: 00:08:17.946 { 00:08:17.946 "code": -32601, 00:08:17.946 "message": "Method not found" 00:08:17.946 } 00:08:17.946 13:34:06 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:17.946 13:34:06 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:17.946 13:34:06 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:17.946 13:34:06 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:17.946 13:34:06 app_cmdline -- app/cmdline.sh@1 -- # killprocess 406680 00:08:17.946 13:34:06 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 406680 ']' 00:08:17.946 13:34:06 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 406680 00:08:17.946 13:34:06 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:08:17.946 13:34:06 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:17.946 13:34:06 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 406680 00:08:17.946 13:34:06 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:17.947 13:34:06 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:17.947 13:34:06 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 406680' 00:08:17.947 killing process with pid 406680 00:08:17.947 13:34:06 app_cmdline -- common/autotest_common.sh@967 -- # kill 406680 00:08:17.947 13:34:06 app_cmdline -- common/autotest_common.sh@972 -- # wait 406680 00:08:18.206 00:08:18.206 real 0m1.985s 00:08:18.206 user 0m2.354s 00:08:18.206 sys 0m0.611s 00:08:18.206 13:34:06 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:18.206 13:34:06 app_cmdline -- common/autotest_common.sh@10 -- # set +x 
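The app_cmdline case above starts spdk_tgt with --rpcs-allowed spdk_get_version,rpc_get_methods and then checks that exactly those two methods are exposed: spdk_get_version returns the version object shown, rpc_get_methods lists the two names, and any other call (env_dpdk_get_mem_stats here) is rejected with the JSON-RPC -32601 "Method not found" error. A minimal sketch of the same sequence, using the workspace paths from the transcript; the plain sleep in place of the harness's waitforlisten helper is an assumption.

# Start the target with a restricted RPC allow-list (as in the run above).
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
sleep 2   # assumption: crude wait instead of the harness's waitforlisten

# Allowed methods behave normally.
$SPDK/scripts/rpc.py spdk_get_version
$SPDK/scripts/rpc.py rpc_get_methods | jq -r '.[]' | sort

# Anything outside the allow-list fails with code -32601 ("Method not found").
$SPDK/scripts/rpc.py env_dpdk_get_mem_stats || true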
00:08:18.206 ************************************ 00:08:18.206 END TEST app_cmdline 00:08:18.206 ************************************ 00:08:18.465 13:34:06 -- common/autotest_common.sh@1142 -- # return 0 00:08:18.465 13:34:06 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:18.465 13:34:06 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:18.465 13:34:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:18.465 13:34:06 -- common/autotest_common.sh@10 -- # set +x 00:08:18.465 ************************************ 00:08:18.465 START TEST version 00:08:18.465 ************************************ 00:08:18.465 13:34:06 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:18.465 * Looking for test storage... 00:08:18.465 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:18.465 13:34:06 version -- app/version.sh@17 -- # get_header_version major 00:08:18.465 13:34:06 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:18.465 13:34:06 version -- app/version.sh@14 -- # cut -f2 00:08:18.465 13:34:06 version -- app/version.sh@14 -- # tr -d '"' 00:08:18.465 13:34:06 version -- app/version.sh@17 -- # major=24 00:08:18.465 13:34:06 version -- app/version.sh@18 -- # get_header_version minor 00:08:18.465 13:34:06 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:18.465 13:34:06 version -- app/version.sh@14 -- # cut -f2 00:08:18.465 13:34:06 version -- app/version.sh@14 -- # tr -d '"' 00:08:18.465 13:34:06 version -- app/version.sh@18 -- # minor=9 00:08:18.465 13:34:07 version -- app/version.sh@19 -- # get_header_version patch 00:08:18.465 13:34:07 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:18.465 13:34:07 version -- app/version.sh@14 -- # cut -f2 00:08:18.465 13:34:07 version -- app/version.sh@14 -- # tr -d '"' 00:08:18.465 13:34:07 version -- app/version.sh@19 -- # patch=0 00:08:18.465 13:34:07 version -- app/version.sh@20 -- # get_header_version suffix 00:08:18.465 13:34:07 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:18.465 13:34:07 version -- app/version.sh@14 -- # cut -f2 00:08:18.465 13:34:07 version -- app/version.sh@14 -- # tr -d '"' 00:08:18.465 13:34:07 version -- app/version.sh@20 -- # suffix=-pre 00:08:18.465 13:34:07 version -- app/version.sh@22 -- # version=24.9 00:08:18.465 13:34:07 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:18.465 13:34:07 version -- app/version.sh@28 -- # version=24.9rc0 00:08:18.465 13:34:07 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:18.465 13:34:07 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:18.724 13:34:07 version -- app/version.sh@30 -- # py_version=24.9rc0 00:08:18.724 
13:34:07 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:08:18.724 00:08:18.724 real 0m0.190s 00:08:18.724 user 0m0.088s 00:08:18.724 sys 0m0.151s 00:08:18.724 13:34:07 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:18.724 13:34:07 version -- common/autotest_common.sh@10 -- # set +x 00:08:18.724 ************************************ 00:08:18.724 END TEST version 00:08:18.724 ************************************ 00:08:18.724 13:34:07 -- common/autotest_common.sh@1142 -- # return 0 00:08:18.724 13:34:07 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:08:18.724 13:34:07 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:18.724 13:34:07 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:18.724 13:34:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:18.724 13:34:07 -- common/autotest_common.sh@10 -- # set +x 00:08:18.724 ************************************ 00:08:18.724 START TEST blockdev_general 00:08:18.724 ************************************ 00:08:18.724 13:34:07 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:18.724 * Looking for test storage... 00:08:18.724 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:18.724 13:34:07 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:18.724 13:34:07 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:18.724 13:34:07 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:18.724 13:34:07 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:18.724 13:34:07 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:18.724 13:34:07 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:18.724 13:34:07 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@691 -- # 
wait_for_rpc=--wait-for-rpc 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=407148 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 407148 00:08:18.725 13:34:07 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 407148 ']' 00:08:18.725 13:34:07 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:18.725 13:34:07 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:18.725 13:34:07 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:18.725 13:34:07 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:18.725 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:18.725 13:34:07 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:18.725 13:34:07 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:18.983 [2024-07-12 13:34:07.334033] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:08:18.983 [2024-07-12 13:34:07.334110] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid407148 ] 00:08:18.983 [2024-07-12 13:34:07.464868] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.242 [2024-07-12 13:34:07.570893] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.809 13:34:08 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:19.809 13:34:08 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:08:19.809 13:34:08 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:08:19.809 13:34:08 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:08:19.809 13:34:08 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:08:19.809 13:34:08 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:19.809 13:34:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:20.068 [2024-07-12 13:34:08.564835] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:20.068 [2024-07-12 13:34:08.564891] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:20.068 00:08:20.068 [2024-07-12 13:34:08.572817] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:20.068 [2024-07-12 13:34:08.572843] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:20.068 00:08:20.068 Malloc0 00:08:20.068 Malloc1 00:08:20.068 Malloc2 00:08:20.068 Malloc3 00:08:20.068 Malloc4 00:08:20.326 Malloc5 00:08:20.326 Malloc6 00:08:20.326 Malloc7 00:08:20.326 Malloc8 00:08:20.326 Malloc9 00:08:20.326 [2024-07-12 13:34:08.722092] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:20.326 [2024-07-12 13:34:08.722137] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:20.326 [2024-07-12 
13:34:08.722157] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10473c0 00:08:20.326 [2024-07-12 13:34:08.722170] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:20.326 [2024-07-12 13:34:08.723517] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:20.326 [2024-07-12 13:34:08.723545] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:20.326 TestPT 00:08:20.326 13:34:08 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:20.326 13:34:08 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:08:20.326 5000+0 records in 00:08:20.326 5000+0 records out 00:08:20.326 10240000 bytes (10 MB, 9.8 MiB) copied, 0.024887 s, 411 MB/s 00:08:20.326 13:34:08 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:08:20.326 13:34:08 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:20.326 13:34:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:20.326 AIO0 00:08:20.326 13:34:08 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:20.326 13:34:08 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:08:20.326 13:34:08 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:20.326 13:34:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:20.326 13:34:08 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:20.326 13:34:08 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:08:20.326 13:34:08 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:08:20.326 13:34:08 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:20.327 13:34:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:20.327 13:34:08 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:20.327 13:34:08 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:08:20.327 13:34:08 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:20.327 13:34:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:20.327 13:34:08 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:20.327 13:34:08 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:08:20.327 13:34:08 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:20.327 13:34:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:20.585 13:34:08 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:20.585 13:34:08 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:08:20.585 13:34:08 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:08:20.585 13:34:08 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:08:20.585 13:34:08 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:20.585 13:34:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:20.585 13:34:09 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:20.585 13:34:09 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:08:20.585 13:34:09 
blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:08:20.587 13:34:09 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "b54295b6-dfcd-4f0c-bcec-95fc3f9789dd"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b54295b6-dfcd-4f0c-bcec-95fc3f9789dd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "16db39df-b484-59f3-892f-38fcbfdf91d4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "16db39df-b484-59f3-892f-38fcbfdf91d4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "e9bdf4d1-adc3-5abc-8d90-916baecd391a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e9bdf4d1-adc3-5abc-8d90-916baecd391a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "d5978583-f4ee-5a7a-82e0-668a13a80b62"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d5978583-f4ee-5a7a-82e0-668a13a80b62",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "602aac2f-9901-5a87-ae3e-d1e856e8cbe5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "602aac2f-9901-5a87-ae3e-d1e856e8cbe5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "eca3a967-15ab-5017-84b3-9f4fd5d8a194"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "eca3a967-15ab-5017-84b3-9f4fd5d8a194",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "3fa35bdb-c241-52ad-9e8b-cf39e99be0e0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3fa35bdb-c241-52ad-9e8b-cf39e99be0e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' 
"47519d7d-3387-58d9-87bc-00fe1479d37f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "47519d7d-3387-58d9-87bc-00fe1479d37f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "46978a68-410e-5a69-86f9-c63d78064cd3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "46978a68-410e-5a69-86f9-c63d78064cd3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "fa65e3d7-a815-5b29-91be-a747ab168765"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fa65e3d7-a815-5b29-91be-a747ab168765",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "2c4fbf9a-94b6-5921-bdf5-d3fb7fa20e98"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2c4fbf9a-94b6-5921-bdf5-d3fb7fa20e98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": 
false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "8a872f31-e213-59da-ad5c-4bbc641cf23f"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8a872f31-e213-59da-ad5c-4bbc641cf23f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "887a9296-4a15-4ae1-bfe4-a25bfd1ba94d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "887a9296-4a15-4ae1-bfe4-a25bfd1ba94d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "887a9296-4a15-4ae1-bfe4-a25bfd1ba94d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "82f065d8-cba3-44be-95f5-36b3d63dbfc4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "a3677b05-119b-4a6d-a3b6-08fe1271fab9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "f6a2fd49-906b-41f2-8a76-b7f9d08dfd67"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f6a2fd49-906b-41f2-8a76-b7f9d08dfd67",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f6a2fd49-906b-41f2-8a76-b7f9d08dfd67",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "0b590ef9-e259-4325-a063-cc78c88dbec2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "134cde33-6b1b-49c6-99c6-d49831308134",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "0a286f5e-98f2-429f-b91c-d126e3ce2d3c"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0a286f5e-98f2-429f-b91c-d126e3ce2d3c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "0a286f5e-98f2-429f-b91c-d126e3ce2d3c",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "e03d49df-bd3f-4144-a6bd-091b43bd4889",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "23778c45-1042-463e-be30-84bdde1bd4df",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "18ad1597-9b1d-4116-aa64-74f795158054"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": 
"18ad1597-9b1d-4116-aa64-74f795158054",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:20.845 13:34:09 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:08:20.846 13:34:09 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:08:20.846 13:34:09 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:08:20.846 13:34:09 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 407148 00:08:20.846 13:34:09 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 407148 ']' 00:08:20.846 13:34:09 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 407148 00:08:20.846 13:34:09 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:08:20.846 13:34:09 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:20.846 13:34:09 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 407148 00:08:20.846 13:34:09 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:20.846 13:34:09 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:20.846 13:34:09 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 407148' 00:08:20.846 killing process with pid 407148 00:08:20.846 13:34:09 blockdev_general -- common/autotest_common.sh@967 -- # kill 407148 00:08:20.846 13:34:09 blockdev_general -- common/autotest_common.sh@972 -- # wait 407148 00:08:21.414 13:34:09 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:21.414 13:34:09 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:21.414 13:34:09 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:21.414 13:34:09 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:21.414 13:34:09 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:21.414 ************************************ 00:08:21.414 START TEST bdev_hello_world 00:08:21.414 ************************************ 00:08:21.414 13:34:09 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:21.414 [2024-07-12 13:34:09.863734] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:08:21.414 [2024-07-12 13:34:09.863809] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid407445 ] 00:08:21.673 [2024-07-12 13:34:10.008090] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.673 [2024-07-12 13:34:10.122494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.932 [2024-07-12 13:34:10.280551] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:21.932 [2024-07-12 13:34:10.280602] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:21.932 [2024-07-12 13:34:10.280617] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:21.932 [2024-07-12 13:34:10.288559] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:21.932 [2024-07-12 13:34:10.288586] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:21.932 [2024-07-12 13:34:10.296569] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:21.932 [2024-07-12 13:34:10.296593] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:21.932 [2024-07-12 13:34:10.369709] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:21.932 [2024-07-12 13:34:10.369759] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:21.932 [2024-07-12 13:34:10.369775] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23609e0 00:08:21.932 [2024-07-12 13:34:10.369788] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:21.932 [2024-07-12 13:34:10.371379] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:21.932 [2024-07-12 13:34:10.371406] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:21.932 [2024-07-12 13:34:10.511381] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:08:21.932 [2024-07-12 13:34:10.511444] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:08:21.932 [2024-07-12 13:34:10.511495] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:08:21.932 [2024-07-12 13:34:10.511565] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:08:21.932 [2024-07-12 13:34:10.511636] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:08:21.932 [2024-07-12 13:34:10.511665] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:08:21.932 [2024-07-12 13:34:10.511732] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
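The hello_bdev NOTICE lines above trace the example's whole data path: open the bdev named by -b, get an I/O channel, write "Hello World!" to it, read it back, then stop the app. As a reference only, a minimal sketch of invoking the same example by hand against this configuration; the relative paths assume an SPDK build tree and are not taken from the harness:

  # -b picks the bdev to exercise (hello_world_bdev=Malloc0 above);
  # --json supplies the bdev configuration saved earlier in this suite
  ./build/examples/hello_bdev --json ./test/bdev/bdev.json -b Malloc0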
00:08:21.932 00:08:21.932 [2024-07-12 13:34:10.511769] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:08:22.499 00:08:22.499 real 0m1.054s 00:08:22.499 user 0m0.701s 00:08:22.499 sys 0m0.317s 00:08:22.499 13:34:10 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:22.499 13:34:10 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:08:22.500 ************************************ 00:08:22.500 END TEST bdev_hello_world 00:08:22.500 ************************************ 00:08:22.500 13:34:10 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:22.500 13:34:10 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:08:22.500 13:34:10 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:22.500 13:34:10 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:22.500 13:34:10 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:22.500 ************************************ 00:08:22.500 START TEST bdev_bounds 00:08:22.500 ************************************ 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=407553 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 407553' 00:08:22.500 Process bdevio pid: 407553 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 407553 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 407553 ']' 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:22.500 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:22.500 13:34:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:22.500 [2024-07-12 13:34:11.003267] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
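The bdevio run starting here loads the same bdev.json that the earlier spdk_tgt instance populated via setup_bdev_conf and the save_subsystem_config calls traced above. As a reminder of what that layout contains (it matches the bdev_get_bdevs dump earlier), here is a minimal hand-written RPC sketch that would reproduce it; the rpc.py flag spellings and paths are assumptions, not a copy of the harness code:

  ./scripts/rpc.py bdev_malloc_create -b Malloc0 32 512          # ten of these, Malloc0..Malloc9: 32 MiB, 512-byte blocks
  ./scripts/rpc.py bdev_split_create Malloc1 2                   # -> Malloc1p0, Malloc1p1
  ./scripts/rpc.py bdev_split_create Malloc2 8                   # -> Malloc2p0 .. Malloc2p7
  ./scripts/rpc.py bdev_passthru_create -b Malloc3 -p TestPT     # passthru bdev "TestPT" on Malloc3
  ./scripts/rpc.py bdev_raid_create -n raid0 -z 64 -r raid0 -b "Malloc4 Malloc5"
  ./scripts/rpc.py bdev_raid_create -n concat0 -z 64 -r concat -b "Malloc6 Malloc7"
  ./scripts/rpc.py bdev_raid_create -n raid1 -r raid1 -b "Malloc8 Malloc9"
  ./scripts/rpc.py bdev_aio_create ./test/bdev/aiofile AIO0 2048 # the 10 MB file created with dd earlier
  ./scripts/rpc.py bdev_get_bdevs | jq -r '.[] | select(.claimed == false) | .name'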
00:08:22.500 [2024-07-12 13:34:11.003334] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid407553 ] 00:08:22.758 [2024-07-12 13:34:11.127704] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:22.758 [2024-07-12 13:34:11.228842] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:22.758 [2024-07-12 13:34:11.228954] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:22.758 [2024-07-12 13:34:11.228958] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.017 [2024-07-12 13:34:11.377074] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:23.017 [2024-07-12 13:34:11.377137] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:23.017 [2024-07-12 13:34:11.377152] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:23.017 [2024-07-12 13:34:11.385083] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:23.017 [2024-07-12 13:34:11.385108] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:23.017 [2024-07-12 13:34:11.393099] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:23.017 [2024-07-12 13:34:11.393123] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:23.017 [2024-07-12 13:34:11.465729] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:23.017 [2024-07-12 13:34:11.465778] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:23.017 [2024-07-12 13:34:11.465796] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x201ca00 00:08:23.017 [2024-07-12 13:34:11.465808] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:23.017 [2024-07-12 13:34:11.467251] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:23.017 [2024-07-12 13:34:11.467279] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:23.585 13:34:11 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:23.585 13:34:11 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:08:23.585 13:34:11 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:08:23.585 I/O targets: 00:08:23.585 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:08:23.585 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:08:23.585 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:08:23.585 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:08:23.585 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:08:23.585 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:08:23.585 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:08:23.585 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:08:23.585 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:08:23.585 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:08:23.585 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:08:23.585 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:08:23.585 raid0: 131072 blocks of 512 bytes (64 MiB) 00:08:23.585 concat0: 131072 blocks of 512 bytes (64 MiB) 
00:08:23.585 raid1: 65536 blocks of 512 bytes (32 MiB) 00:08:23.585 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:08:23.585 00:08:23.585 00:08:23.585 CUnit - A unit testing framework for C - Version 2.1-3 00:08:23.585 http://cunit.sourceforge.net/ 00:08:23.585 00:08:23.585 00:08:23.585 Suite: bdevio tests on: AIO0 00:08:23.585 Test: blockdev write read block ...passed 00:08:23.585 Test: blockdev write zeroes read block ...passed 00:08:23.585 Test: blockdev write zeroes read no split ...passed 00:08:23.585 Test: blockdev write zeroes read split ...passed 00:08:23.585 Test: blockdev write zeroes read split partial ...passed 00:08:23.585 Test: blockdev reset ...passed 00:08:23.585 Test: blockdev write read 8 blocks ...passed 00:08:23.585 Test: blockdev write read size > 128k ...passed 00:08:23.585 Test: blockdev write read invalid size ...passed 00:08:23.585 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.585 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.585 Test: blockdev write read max offset ...passed 00:08:23.585 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.585 Test: blockdev writev readv 8 blocks ...passed 00:08:23.585 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.585 Test: blockdev writev readv block ...passed 00:08:23.585 Test: blockdev writev readv size > 128k ...passed 00:08:23.585 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.585 Test: blockdev comparev and writev ...passed 00:08:23.585 Test: blockdev nvme passthru rw ...passed 00:08:23.585 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.585 Test: blockdev nvme admin passthru ...passed 00:08:23.585 Test: blockdev copy ...passed 00:08:23.585 Suite: bdevio tests on: raid1 00:08:23.585 Test: blockdev write read block ...passed 00:08:23.585 Test: blockdev write zeroes read block ...passed 00:08:23.585 Test: blockdev write zeroes read no split ...passed 00:08:23.585 Test: blockdev write zeroes read split ...passed 00:08:23.585 Test: blockdev write zeroes read split partial ...passed 00:08:23.585 Test: blockdev reset ...passed 00:08:23.585 Test: blockdev write read 8 blocks ...passed 00:08:23.585 Test: blockdev write read size > 128k ...passed 00:08:23.585 Test: blockdev write read invalid size ...passed 00:08:23.585 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.585 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.585 Test: blockdev write read max offset ...passed 00:08:23.585 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.585 Test: blockdev writev readv 8 blocks ...passed 00:08:23.585 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.585 Test: blockdev writev readv block ...passed 00:08:23.585 Test: blockdev writev readv size > 128k ...passed 00:08:23.585 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.585 Test: blockdev comparev and writev ...passed 00:08:23.585 Test: blockdev nvme passthru rw ...passed 00:08:23.585 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.585 Test: blockdev nvme admin passthru ...passed 00:08:23.585 Test: blockdev copy ...passed 00:08:23.585 Suite: bdevio tests on: concat0 00:08:23.585 Test: blockdev write read block ...passed 00:08:23.585 Test: blockdev write zeroes read block ...passed 00:08:23.585 Test: blockdev write zeroes read no split ...passed 00:08:23.585 Test: blockdev write zeroes read split 
...passed 00:08:23.585 Test: blockdev write zeroes read split partial ...passed 00:08:23.585 Test: blockdev reset ...passed 00:08:23.585 Test: blockdev write read 8 blocks ...passed 00:08:23.585 Test: blockdev write read size > 128k ...passed 00:08:23.585 Test: blockdev write read invalid size ...passed 00:08:23.585 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.585 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.585 Test: blockdev write read max offset ...passed 00:08:23.585 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.585 Test: blockdev writev readv 8 blocks ...passed 00:08:23.585 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.585 Test: blockdev writev readv block ...passed 00:08:23.585 Test: blockdev writev readv size > 128k ...passed 00:08:23.585 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.585 Test: blockdev comparev and writev ...passed 00:08:23.585 Test: blockdev nvme passthru rw ...passed 00:08:23.586 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.586 Test: blockdev nvme admin passthru ...passed 00:08:23.586 Test: blockdev copy ...passed 00:08:23.586 Suite: bdevio tests on: raid0 00:08:23.586 Test: blockdev write read block ...passed 00:08:23.586 Test: blockdev write zeroes read block ...passed 00:08:23.586 Test: blockdev write zeroes read no split ...passed 00:08:23.586 Test: blockdev write zeroes read split ...passed 00:08:23.586 Test: blockdev write zeroes read split partial ...passed 00:08:23.586 Test: blockdev reset ...passed 00:08:23.586 Test: blockdev write read 8 blocks ...passed 00:08:23.586 Test: blockdev write read size > 128k ...passed 00:08:23.586 Test: blockdev write read invalid size ...passed 00:08:23.586 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.586 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.586 Test: blockdev write read max offset ...passed 00:08:23.586 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.586 Test: blockdev writev readv 8 blocks ...passed 00:08:23.586 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.586 Test: blockdev writev readv block ...passed 00:08:23.586 Test: blockdev writev readv size > 128k ...passed 00:08:23.586 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.586 Test: blockdev comparev and writev ...passed 00:08:23.586 Test: blockdev nvme passthru rw ...passed 00:08:23.586 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.586 Test: blockdev nvme admin passthru ...passed 00:08:23.586 Test: blockdev copy ...passed 00:08:23.586 Suite: bdevio tests on: TestPT 00:08:23.586 Test: blockdev write read block ...passed 00:08:23.586 Test: blockdev write zeroes read block ...passed 00:08:23.586 Test: blockdev write zeroes read no split ...passed 00:08:23.586 Test: blockdev write zeroes read split ...passed 00:08:23.845 Test: blockdev write zeroes read split partial ...passed 00:08:23.846 Test: blockdev reset ...passed 00:08:23.846 Test: blockdev write read 8 blocks ...passed 00:08:23.846 Test: blockdev write read size > 128k ...passed 00:08:23.846 Test: blockdev write read invalid size ...passed 00:08:23.846 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.846 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.846 Test: blockdev write read max offset ...passed 00:08:23.846 Test: 
blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.846 Test: blockdev writev readv 8 blocks ...passed 00:08:23.846 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.846 Test: blockdev writev readv block ...passed 00:08:23.846 Test: blockdev writev readv size > 128k ...passed 00:08:23.846 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.846 Test: blockdev comparev and writev ...passed 00:08:23.846 Test: blockdev nvme passthru rw ...passed 00:08:23.846 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.846 Test: blockdev nvme admin passthru ...passed 00:08:23.846 Test: blockdev copy ...passed 00:08:23.846 Suite: bdevio tests on: Malloc2p7 00:08:23.846 Test: blockdev write read block ...passed 00:08:23.846 Test: blockdev write zeroes read block ...passed 00:08:23.846 Test: blockdev write zeroes read no split ...passed 00:08:23.846 Test: blockdev write zeroes read split ...passed 00:08:23.846 Test: blockdev write zeroes read split partial ...passed 00:08:23.846 Test: blockdev reset ...passed 00:08:23.846 Test: blockdev write read 8 blocks ...passed 00:08:23.846 Test: blockdev write read size > 128k ...passed 00:08:23.846 Test: blockdev write read invalid size ...passed 00:08:23.846 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.846 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.846 Test: blockdev write read max offset ...passed 00:08:23.846 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.846 Test: blockdev writev readv 8 blocks ...passed 00:08:23.846 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.846 Test: blockdev writev readv block ...passed 00:08:23.846 Test: blockdev writev readv size > 128k ...passed 00:08:23.846 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.846 Test: blockdev comparev and writev ...passed 00:08:23.846 Test: blockdev nvme passthru rw ...passed 00:08:23.846 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.846 Test: blockdev nvme admin passthru ...passed 00:08:23.846 Test: blockdev copy ...passed 00:08:23.846 Suite: bdevio tests on: Malloc2p6 00:08:23.846 Test: blockdev write read block ...passed 00:08:23.846 Test: blockdev write zeroes read block ...passed 00:08:23.846 Test: blockdev write zeroes read no split ...passed 00:08:23.846 Test: blockdev write zeroes read split ...passed 00:08:23.846 Test: blockdev write zeroes read split partial ...passed 00:08:23.846 Test: blockdev reset ...passed 00:08:23.846 Test: blockdev write read 8 blocks ...passed 00:08:23.846 Test: blockdev write read size > 128k ...passed 00:08:23.846 Test: blockdev write read invalid size ...passed 00:08:23.846 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.846 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.846 Test: blockdev write read max offset ...passed 00:08:23.846 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.846 Test: blockdev writev readv 8 blocks ...passed 00:08:23.846 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.846 Test: blockdev writev readv block ...passed 00:08:23.846 Test: blockdev writev readv size > 128k ...passed 00:08:23.846 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.846 Test: blockdev comparev and writev ...passed 00:08:23.846 Test: blockdev nvme passthru rw ...passed 00:08:23.846 Test: blockdev nvme passthru vendor 
specific ...passed 00:08:23.846 Test: blockdev nvme admin passthru ...passed 00:08:23.846 Test: blockdev copy ...passed 00:08:23.846 Suite: bdevio tests on: Malloc2p5 00:08:23.846 Test: blockdev write read block ...passed 00:08:23.846 Test: blockdev write zeroes read block ...passed 00:08:23.846 Test: blockdev write zeroes read no split ...passed 00:08:23.846 Test: blockdev write zeroes read split ...passed 00:08:23.846 Test: blockdev write zeroes read split partial ...passed 00:08:23.846 Test: blockdev reset ...passed 00:08:23.846 Test: blockdev write read 8 blocks ...passed 00:08:23.846 Test: blockdev write read size > 128k ...passed 00:08:23.846 Test: blockdev write read invalid size ...passed 00:08:23.846 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.846 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.846 Test: blockdev write read max offset ...passed 00:08:23.846 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.846 Test: blockdev writev readv 8 blocks ...passed 00:08:23.846 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.846 Test: blockdev writev readv block ...passed 00:08:23.846 Test: blockdev writev readv size > 128k ...passed 00:08:23.846 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.846 Test: blockdev comparev and writev ...passed 00:08:23.846 Test: blockdev nvme passthru rw ...passed 00:08:23.846 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.846 Test: blockdev nvme admin passthru ...passed 00:08:23.846 Test: blockdev copy ...passed 00:08:23.846 Suite: bdevio tests on: Malloc2p4 00:08:23.846 Test: blockdev write read block ...passed 00:08:23.846 Test: blockdev write zeroes read block ...passed 00:08:23.846 Test: blockdev write zeroes read no split ...passed 00:08:23.846 Test: blockdev write zeroes read split ...passed 00:08:23.846 Test: blockdev write zeroes read split partial ...passed 00:08:23.846 Test: blockdev reset ...passed 00:08:23.846 Test: blockdev write read 8 blocks ...passed 00:08:23.846 Test: blockdev write read size > 128k ...passed 00:08:23.846 Test: blockdev write read invalid size ...passed 00:08:23.846 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.846 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.846 Test: blockdev write read max offset ...passed 00:08:23.846 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.846 Test: blockdev writev readv 8 blocks ...passed 00:08:23.846 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.846 Test: blockdev writev readv block ...passed 00:08:23.846 Test: blockdev writev readv size > 128k ...passed 00:08:23.846 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.846 Test: blockdev comparev and writev ...passed 00:08:23.846 Test: blockdev nvme passthru rw ...passed 00:08:23.846 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.846 Test: blockdev nvme admin passthru ...passed 00:08:23.846 Test: blockdev copy ...passed 00:08:23.846 Suite: bdevio tests on: Malloc2p3 00:08:23.846 Test: blockdev write read block ...passed 00:08:23.846 Test: blockdev write zeroes read block ...passed 00:08:23.846 Test: blockdev write zeroes read no split ...passed 00:08:23.846 Test: blockdev write zeroes read split ...passed 00:08:23.846 Test: blockdev write zeroes read split partial ...passed 00:08:23.846 Test: blockdev reset ...passed 00:08:23.846 Test: 
blockdev write read 8 blocks ...passed 00:08:23.846 Test: blockdev write read size > 128k ...passed 00:08:23.846 Test: blockdev write read invalid size ...passed 00:08:23.846 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.846 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.846 Test: blockdev write read max offset ...passed 00:08:23.846 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.846 Test: blockdev writev readv 8 blocks ...passed 00:08:23.846 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.846 Test: blockdev writev readv block ...passed 00:08:23.846 Test: blockdev writev readv size > 128k ...passed 00:08:23.846 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.846 Test: blockdev comparev and writev ...passed 00:08:23.846 Test: blockdev nvme passthru rw ...passed 00:08:23.846 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.846 Test: blockdev nvme admin passthru ...passed 00:08:23.846 Test: blockdev copy ...passed 00:08:23.846 Suite: bdevio tests on: Malloc2p2 00:08:23.846 Test: blockdev write read block ...passed 00:08:23.846 Test: blockdev write zeroes read block ...passed 00:08:23.846 Test: blockdev write zeroes read no split ...passed 00:08:23.846 Test: blockdev write zeroes read split ...passed 00:08:23.846 Test: blockdev write zeroes read split partial ...passed 00:08:23.846 Test: blockdev reset ...passed 00:08:23.846 Test: blockdev write read 8 blocks ...passed 00:08:23.846 Test: blockdev write read size > 128k ...passed 00:08:23.846 Test: blockdev write read invalid size ...passed 00:08:23.846 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.846 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.846 Test: blockdev write read max offset ...passed 00:08:23.846 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.846 Test: blockdev writev readv 8 blocks ...passed 00:08:23.846 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.846 Test: blockdev writev readv block ...passed 00:08:23.846 Test: blockdev writev readv size > 128k ...passed 00:08:23.846 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.846 Test: blockdev comparev and writev ...passed 00:08:23.846 Test: blockdev nvme passthru rw ...passed 00:08:23.846 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.846 Test: blockdev nvme admin passthru ...passed 00:08:23.846 Test: blockdev copy ...passed 00:08:23.846 Suite: bdevio tests on: Malloc2p1 00:08:23.846 Test: blockdev write read block ...passed 00:08:23.846 Test: blockdev write zeroes read block ...passed 00:08:23.846 Test: blockdev write zeroes read no split ...passed 00:08:23.846 Test: blockdev write zeroes read split ...passed 00:08:23.846 Test: blockdev write zeroes read split partial ...passed 00:08:23.846 Test: blockdev reset ...passed 00:08:23.846 Test: blockdev write read 8 blocks ...passed 00:08:23.846 Test: blockdev write read size > 128k ...passed 00:08:23.846 Test: blockdev write read invalid size ...passed 00:08:23.846 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.846 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.846 Test: blockdev write read max offset ...passed 00:08:23.846 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.846 Test: blockdev writev readv 8 blocks ...passed 00:08:23.846 
Test: blockdev writev readv 30 x 1block ...passed 00:08:23.846 Test: blockdev writev readv block ...passed 00:08:23.846 Test: blockdev writev readv size > 128k ...passed 00:08:23.846 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.847 Test: blockdev comparev and writev ...passed 00:08:23.847 Test: blockdev nvme passthru rw ...passed 00:08:23.847 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.847 Test: blockdev nvme admin passthru ...passed 00:08:23.847 Test: blockdev copy ...passed 00:08:23.847 Suite: bdevio tests on: Malloc2p0 00:08:23.847 Test: blockdev write read block ...passed 00:08:23.847 Test: blockdev write zeroes read block ...passed 00:08:23.847 Test: blockdev write zeroes read no split ...passed 00:08:23.847 Test: blockdev write zeroes read split ...passed 00:08:23.847 Test: blockdev write zeroes read split partial ...passed 00:08:23.847 Test: blockdev reset ...passed 00:08:23.847 Test: blockdev write read 8 blocks ...passed 00:08:23.847 Test: blockdev write read size > 128k ...passed 00:08:23.847 Test: blockdev write read invalid size ...passed 00:08:23.847 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.847 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.847 Test: blockdev write read max offset ...passed 00:08:23.847 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.847 Test: blockdev writev readv 8 blocks ...passed 00:08:23.847 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.847 Test: blockdev writev readv block ...passed 00:08:23.847 Test: blockdev writev readv size > 128k ...passed 00:08:23.847 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.847 Test: blockdev comparev and writev ...passed 00:08:23.847 Test: blockdev nvme passthru rw ...passed 00:08:23.847 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.847 Test: blockdev nvme admin passthru ...passed 00:08:23.847 Test: blockdev copy ...passed 00:08:23.847 Suite: bdevio tests on: Malloc1p1 00:08:23.847 Test: blockdev write read block ...passed 00:08:23.847 Test: blockdev write zeroes read block ...passed 00:08:23.847 Test: blockdev write zeroes read no split ...passed 00:08:23.847 Test: blockdev write zeroes read split ...passed 00:08:23.847 Test: blockdev write zeroes read split partial ...passed 00:08:23.847 Test: blockdev reset ...passed 00:08:23.847 Test: blockdev write read 8 blocks ...passed 00:08:23.847 Test: blockdev write read size > 128k ...passed 00:08:23.847 Test: blockdev write read invalid size ...passed 00:08:23.847 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.847 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.847 Test: blockdev write read max offset ...passed 00:08:23.847 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.847 Test: blockdev writev readv 8 blocks ...passed 00:08:23.847 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.847 Test: blockdev writev readv block ...passed 00:08:23.847 Test: blockdev writev readv size > 128k ...passed 00:08:23.847 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.847 Test: blockdev comparev and writev ...passed 00:08:23.847 Test: blockdev nvme passthru rw ...passed 00:08:23.847 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.847 Test: blockdev nvme admin passthru ...passed 00:08:23.847 Test: blockdev copy ...passed 00:08:23.847 Suite: 
bdevio tests on: Malloc1p0 00:08:23.847 Test: blockdev write read block ...passed 00:08:23.847 Test: blockdev write zeroes read block ...passed 00:08:23.847 Test: blockdev write zeroes read no split ...passed 00:08:23.847 Test: blockdev write zeroes read split ...passed 00:08:23.847 Test: blockdev write zeroes read split partial ...passed 00:08:23.847 Test: blockdev reset ...passed 00:08:23.847 Test: blockdev write read 8 blocks ...passed 00:08:23.847 Test: blockdev write read size > 128k ...passed 00:08:23.847 Test: blockdev write read invalid size ...passed 00:08:23.847 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.847 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.847 Test: blockdev write read max offset ...passed 00:08:23.847 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.847 Test: blockdev writev readv 8 blocks ...passed 00:08:23.847 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.847 Test: blockdev writev readv block ...passed 00:08:23.847 Test: blockdev writev readv size > 128k ...passed 00:08:23.847 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.847 Test: blockdev comparev and writev ...passed 00:08:23.847 Test: blockdev nvme passthru rw ...passed 00:08:23.847 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.847 Test: blockdev nvme admin passthru ...passed 00:08:23.847 Test: blockdev copy ...passed 00:08:23.847 Suite: bdevio tests on: Malloc0 00:08:23.847 Test: blockdev write read block ...passed 00:08:23.847 Test: blockdev write zeroes read block ...passed 00:08:23.847 Test: blockdev write zeroes read no split ...passed 00:08:23.847 Test: blockdev write zeroes read split ...passed 00:08:23.847 Test: blockdev write zeroes read split partial ...passed 00:08:23.847 Test: blockdev reset ...passed 00:08:23.847 Test: blockdev write read 8 blocks ...passed 00:08:23.847 Test: blockdev write read size > 128k ...passed 00:08:23.847 Test: blockdev write read invalid size ...passed 00:08:23.847 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:23.847 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:23.847 Test: blockdev write read max offset ...passed 00:08:23.847 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:23.847 Test: blockdev writev readv 8 blocks ...passed 00:08:23.847 Test: blockdev writev readv 30 x 1block ...passed 00:08:23.847 Test: blockdev writev readv block ...passed 00:08:23.847 Test: blockdev writev readv size > 128k ...passed 00:08:23.847 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:23.847 Test: blockdev comparev and writev ...passed 00:08:23.847 Test: blockdev nvme passthru rw ...passed 00:08:23.847 Test: blockdev nvme passthru vendor specific ...passed 00:08:23.847 Test: blockdev nvme admin passthru ...passed 00:08:23.847 Test: blockdev copy ...passed 00:08:23.847 00:08:23.847 Run Summary: Type Total Ran Passed Failed Inactive 00:08:23.847 suites 16 16 n/a 0 0 00:08:23.847 tests 368 368 368 0 0 00:08:23.847 asserts 2224 2224 2224 0 n/a 00:08:23.847 00:08:23.847 Elapsed time = 0.670 seconds 00:08:23.847 0 00:08:23.847 13:34:12 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 407553 00:08:23.847 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 407553 ']' 00:08:23.847 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 407553 
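The summary above (16 suites, 368 tests, 2224 asserts, about 0.67 s) comes from the bdevio app plus its small RPC driver script. A minimal sketch of that two-step flow, mirroring the command line traced at the start of this bounds test; reading -w as "wait for the perform_tests RPC" and the trailing '' as the harness's pass-through extra-arguments slot is my interpretation, not harness documentation:

  ./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json '' &  # start bdevio, idle until told to run
  ./test/bdev/bdevio/tests.py perform_tests                            # RPC that executes every suite shown above
  kill $!                                                              # afterwards the harness tears bdevio down (killprocess 407553)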
00:08:23.847 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:08:23.847 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:23.847 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 407553 00:08:24.106 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:24.106 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:24.106 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 407553' 00:08:24.106 killing process with pid 407553 00:08:24.106 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 407553 00:08:24.106 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 407553 00:08:24.365 13:34:12 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:08:24.365 00:08:24.365 real 0m1.848s 00:08:24.365 user 0m4.620s 00:08:24.365 sys 0m0.493s 00:08:24.365 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:24.365 13:34:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:24.365 ************************************ 00:08:24.365 END TEST bdev_bounds 00:08:24.365 ************************************ 00:08:24.365 13:34:12 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:24.365 13:34:12 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:24.365 13:34:12 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:24.365 13:34:12 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:24.365 13:34:12 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:24.365 ************************************ 00:08:24.365 START TEST bdev_nbd 00:08:24.365 ************************************ 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=407929 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 407929 /var/tmp/spdk-nbd.sock 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 407929 ']' 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:24.365 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:24.365 13:34:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:24.624 [2024-07-12 13:34:12.950915] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
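The trace above launches bdev_svc with the bdev JSON config on /var/tmp/spdk-nbd.sock and waits for the app to come up before any NBD work begins. Below is a minimal sketch of that start-and-wait pattern, using the paths shown in the log; the readiness probe via the spdk_get_version RPC is an assumption standing in for the harness's own waitforlisten helper, not a reproduction of it.

#!/usr/bin/env bash
# Sketch only: start bdev_svc and poll its RPC socket until it answers.
set -euo pipefail

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # checkout path from the log
CONF=$SPDK_DIR/test/bdev/bdev.json                         # bdev config from the log
SOCK=/var/tmp/spdk-nbd.sock

# Launch the minimal SPDK app that only exposes the bdevs described in CONF.
"$SPDK_DIR/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 --json "$CONF" &
svc_pid=$!

# Assumed readiness probe: any cheap RPC works once the socket is listening.
for _ in $(seq 1 100); do
    if "$SPDK_DIR/scripts/rpc.py" -s "$SOCK" spdk_get_version >/dev/null 2>&1; then
        echo "bdev_svc (pid $svc_pid) is listening on $SOCK"
        break
    fi
    sleep 0.1
done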
00:08:24.624 [2024-07-12 13:34:12.951001] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:24.624 [2024-07-12 13:34:13.084589] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.624 [2024-07-12 13:34:13.189566] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.883 [2024-07-12 13:34:13.355416] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:24.883 [2024-07-12 13:34:13.355483] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:24.883 [2024-07-12 13:34:13.355498] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:24.883 [2024-07-12 13:34:13.363426] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:24.883 [2024-07-12 13:34:13.363461] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:24.883 [2024-07-12 13:34:13.371433] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:24.883 [2024-07-12 13:34:13.371457] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:24.883 [2024-07-12 13:34:13.448733] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:24.883 [2024-07-12 13:34:13.448787] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:24.883 [2024-07-12 13:34:13.448806] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc9ccf0 00:08:24.883 [2024-07-12 13:34:13.448818] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:24.883 [2024-07-12 13:34:13.450309] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:24.883 [2024-07-12 13:34:13.450340] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 
'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:25.451 13:34:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:25.709 1+0 records in 00:08:25.709 1+0 records out 00:08:25.709 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259363 s, 15.8 MB/s 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:25.709 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:08:25.968 13:34:14 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:25.968 1+0 records in 00:08:25.968 1+0 records out 00:08:25.968 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254922 s, 16.1 MB/s 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:25.968 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:26.227 1+0 records in 00:08:26.227 1+0 records out 00:08:26.227 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000370116 s, 11.1 MB/s 00:08:26.227 
13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:26.227 13:34:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:26.486 1+0 records in 00:08:26.486 1+0 records out 00:08:26.486 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000346564 s, 11.8 MB/s 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:26.486 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:26.745 1+0 records in 00:08:26.745 1+0 records out 00:08:26.745 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000426946 s, 9.6 MB/s 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:26.745 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:08:27.004 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:08:27.263 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:08:27.263 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:08:27.263 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:27.263 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:27.263 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:27.263 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:27.263 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:27.263 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:27.263 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:27.263 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:27.263 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:08:27.263 1+0 records in 00:08:27.263 1+0 records out 00:08:27.264 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00040432 s, 10.1 MB/s 00:08:27.264 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.264 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:27.264 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.264 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:27.264 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:27.264 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:27.264 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:27.264 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:08:27.522 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:08:27.522 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:27.523 1+0 records in 00:08:27.523 1+0 records out 00:08:27.523 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000390375 s, 10.5 MB/s 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:27.523 13:34:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 
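The repeated blocks above are the per-bdev NBD smoke test: nbd_start_disk maps each bdev to a /dev/nbdN node, the harness polls /proc/partitions until the kernel exposes it, and a single 4 KiB direct read via dd confirms the mapping before the devices are later listed with nbd_get_disks and torn down with nbd_stop_disk (traced further below). A condensed sketch of that loop follows, assuming the same RPC socket as above; the bdev list is abbreviated, the scratch-file path is a placeholder, and the real waitfornbd helper additionally checks the read size before returning.

#!/usr/bin/env bash
# Sketch only: map bdevs over NBD and verify each one with a single direct read.
set -euo pipefail

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"

# Abbreviated list; the log walks all 16 bdevs from the JSON config.
bdevs=(Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4)

for bdev in "${bdevs[@]}"; do
    # With no explicit device argument, the RPC picks a free /dev/nbdN and prints it.
    nbd=$($RPC nbd_start_disk "$bdev")

    # Poll /proc/partitions (up to 20 tries, as in the log) for the new device.
    for _ in $(seq 1 20); do
        grep -q -w "$(basename "$nbd")" /proc/partitions && break
        sleep 0.1
    done

    # One 4 KiB O_DIRECT read proves the bdev is reachable through the mapping.
    dd if="$nbd" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
    rm -f /tmp/nbdtest
done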
00:08:27.781 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:08:27.781 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:08:27.781 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:08:27.781 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:27.781 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:27.781 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:27.781 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:27.781 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:27.781 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:27.782 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:27.782 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:27.782 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:27.782 1+0 records in 00:08:27.782 1+0 records out 00:08:27.782 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000460547 s, 8.9 MB/s 00:08:27.782 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.782 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:27.782 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.782 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:27.782 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:27.782 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:27.782 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:27.782 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:08:28.039 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:08:28.039 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:08:28.039 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:08:28.039 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:28.039 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:28.039 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:28.039 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:28.040 13:34:16 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.040 1+0 records in 00:08:28.040 1+0 records out 00:08:28.040 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000517293 s, 7.9 MB/s 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:28.040 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:08:28.297 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:08:28.297 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:08:28.297 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:08:28.297 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:28.297 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:28.297 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.298 1+0 records in 00:08:28.298 1+0 records out 00:08:28.298 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000517679 s, 7.9 MB/s 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:28.298 13:34:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:28.298 13:34:16 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:28.864 1+0 records in 00:08:28.864 1+0 records out 00:08:28.864 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000556638 s, 7.4 MB/s 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:28.864 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:08:29.121 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.122 1+0 records in 00:08:29.122 1+0 records out 00:08:29.122 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000591897 s, 6.9 MB/s 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:29.122 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.379 1+0 records in 00:08:29.379 1+0 records out 00:08:29.379 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00075881 s, 5.4 MB/s 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:29.379 13:34:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:29.379 13:34:17 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:29.380 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:29.380 13:34:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.637 1+0 records in 00:08:29.637 1+0 records out 00:08:29.637 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000762239 s, 5.4 MB/s 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:29.637 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w 
nbd14 /proc/partitions 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:29.894 1+0 records in 00:08:29.894 1+0 records out 00:08:29.894 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000758468 s, 5.4 MB/s 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:29.894 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:08:30.151 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:08:30.151 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:30.409 1+0 records in 00:08:30.409 1+0 records out 00:08:30.409 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00087752 s, 4.7 MB/s 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:30.409 13:34:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:30.667 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd0", 00:08:30.667 "bdev_name": "Malloc0" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd1", 00:08:30.667 "bdev_name": "Malloc1p0" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd2", 00:08:30.667 "bdev_name": "Malloc1p1" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd3", 00:08:30.667 "bdev_name": "Malloc2p0" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd4", 00:08:30.667 "bdev_name": "Malloc2p1" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd5", 00:08:30.667 "bdev_name": "Malloc2p2" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd6", 00:08:30.667 "bdev_name": "Malloc2p3" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd7", 00:08:30.667 "bdev_name": "Malloc2p4" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd8", 00:08:30.667 "bdev_name": "Malloc2p5" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd9", 00:08:30.667 "bdev_name": "Malloc2p6" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd10", 00:08:30.667 "bdev_name": "Malloc2p7" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd11", 00:08:30.667 "bdev_name": "TestPT" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd12", 00:08:30.667 "bdev_name": "raid0" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd13", 00:08:30.667 "bdev_name": "concat0" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd14", 00:08:30.667 "bdev_name": "raid1" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd15", 00:08:30.667 "bdev_name": "AIO0" 00:08:30.667 } 00:08:30.667 ]' 00:08:30.667 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:08:30.667 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd0", 00:08:30.667 "bdev_name": "Malloc0" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd1", 00:08:30.667 "bdev_name": "Malloc1p0" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd2", 00:08:30.667 "bdev_name": "Malloc1p1" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd3", 00:08:30.667 "bdev_name": "Malloc2p0" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd4", 00:08:30.667 "bdev_name": "Malloc2p1" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd5", 00:08:30.667 "bdev_name": "Malloc2p2" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd6", 00:08:30.667 "bdev_name": "Malloc2p3" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd7", 00:08:30.667 "bdev_name": "Malloc2p4" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd8", 00:08:30.667 "bdev_name": "Malloc2p5" 
00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd9", 00:08:30.667 "bdev_name": "Malloc2p6" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd10", 00:08:30.667 "bdev_name": "Malloc2p7" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd11", 00:08:30.667 "bdev_name": "TestPT" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd12", 00:08:30.667 "bdev_name": "raid0" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd13", 00:08:30.667 "bdev_name": "concat0" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd14", 00:08:30.667 "bdev_name": "raid1" 00:08:30.667 }, 00:08:30.667 { 00:08:30.667 "nbd_device": "/dev/nbd15", 00:08:30.667 "bdev_name": "AIO0" 00:08:30.667 } 00:08:30.667 ]' 00:08:30.667 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:08:30.667 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:08:30.667 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:30.667 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:08:30.667 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:30.667 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:30.667 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:30.667 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:30.925 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:30.925 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:30.925 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:30.925 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:30.925 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:30.925 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:30.925 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:30.925 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:30.925 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:30.925 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:31.183 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:31.183 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:31.183 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:31.183 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:31.183 13:34:19 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:31.183 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:31.183 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:31.183 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:31.183 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:31.183 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:31.441 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:31.441 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:31.441 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:31.441 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:31.441 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:31.441 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:31.441 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:31.441 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:31.441 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:31.441 13:34:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:31.699 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:31.699 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:31.699 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:31.699 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:31.699 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:31.699 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:31.699 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:31.699 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:31.699 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:31.699 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:31.957 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:31.957 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:31.957 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:31.957 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:31.957 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:31.957 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:31.957 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:31.957 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:31.957 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 
-- # for i in "${nbd_list[@]}" 00:08:31.957 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:32.215 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:32.215 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:32.215 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:32.215 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:32.215 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:32.215 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:32.215 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:32.215 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:32.215 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:32.215 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:32.473 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:32.473 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:32.473 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:32.473 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:32.473 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:32.473 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:32.473 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:32.473 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:32.473 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:32.473 13:34:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:32.733 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:32.733 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:32.733 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:32.733 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:32.733 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:32.733 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:32.733 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:32.733 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:32.733 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:32.733 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:32.991 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:32.991 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd8 00:08:32.991 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:32.991 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:32.991 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:32.991 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:32.991 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:32.991 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:32.991 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:32.991 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:33.249 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:33.249 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:33.249 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:33.249 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:33.249 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:33.249 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:33.249 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:33.250 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:33.250 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:33.250 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:33.508 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:33.508 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:33.508 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:33.508 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:33.508 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:33.508 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:33.508 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:33.508 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:33.508 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:33.508 13:34:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:33.767 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:33.767 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:33.767 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:33.767 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:33.767 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:33.767 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd11 /proc/partitions 00:08:33.767 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:33.767 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:33.767 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:33.767 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:34.333 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:34.333 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:34.333 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:34.333 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.333 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.333 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:34.333 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.333 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.333 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.333 13:34:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:34.590 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:34.590 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:34.590 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:34.590 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:34.590 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:34.591 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:34.591 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:34.591 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:34.591 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:34.591 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:35.155 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:35.155 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:35.155 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:35.155 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.155 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.155 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:35.155 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.155 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.155 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:35.155 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:35.413 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:35.413 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:35.413 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:35.413 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:35.413 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:35.413 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:35.413 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:35.413 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:35.413 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:35.413 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:35.413 13:34:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:35.671 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:35.929 /dev/nbd0 00:08:35.929 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:35.929 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:35.929 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:35.929 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:35.929 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:35.929 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:35.929 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:35.930 1+0 records in 00:08:35.930 1+0 records out 00:08:35.930 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000249681 s, 16.4 MB/s 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:35.930 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:08:36.188 /dev/nbd1 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:36.188 1+0 records in 00:08:36.188 1+0 records out 00:08:36.188 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261745 s, 15.6 MB/s 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:36.188 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:08:36.447 /dev/nbd10 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 
)) 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:36.447 1+0 records in 00:08:36.447 1+0 records out 00:08:36.447 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000342296 s, 12.0 MB/s 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:36.447 13:34:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.447 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:36.447 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:36.447 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:36.447 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:36.447 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:08:36.715 /dev/nbd11 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:36.715 1+0 records in 00:08:36.715 1+0 records out 00:08:36.715 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000332886 s, 12.3 MB/s 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm 
-f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:36.715 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:08:36.990 /dev/nbd12 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:36.990 1+0 records in 00:08:36.990 1+0 records out 00:08:36.990 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000450736 s, 9.1 MB/s 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:36.990 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:08:37.273 /dev/nbd13 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:37.273 13:34:25 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:37.273 1+0 records in 00:08:37.273 1+0 records out 00:08:37.273 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000462148 s, 8.9 MB/s 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:37.273 13:34:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:08:37.548 /dev/nbd14 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:37.548 1+0 records in 00:08:37.548 1+0 records out 00:08:37.548 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000562809 s, 7.3 MB/s 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:37.548 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:08:37.822 /dev/nbd15 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:37.822 1+0 records in 00:08:37.822 1+0 records out 00:08:37.822 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000500457 s, 8.2 MB/s 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:37.822 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:08:38.095 /dev/nbd2 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:38.095 13:34:26 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.095 1+0 records in 00:08:38.095 1+0 records out 00:08:38.095 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000466719 s, 8.8 MB/s 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:38.095 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:08:38.366 /dev/nbd3 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.366 1+0 records in 00:08:38.366 1+0 records out 00:08:38.366 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000590743 s, 6.9 MB/s 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:38.366 13:34:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:08:38.655 /dev/nbd4 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.655 1+0 records in 00:08:38.655 1+0 records out 00:08:38.655 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000731312 s, 5.6 MB/s 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:38.655 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:08:38.935 /dev/nbd5 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:38.935 13:34:27 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.935 1+0 records in 00:08:38.935 1+0 records out 00:08:38.935 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000618553 s, 6.6 MB/s 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:38.935 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:08:39.196 /dev/nbd6 00:08:39.196 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:08:39.196 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:08:39.196 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:39.196 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:39.196 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:39.196 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:39.196 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:39.196 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:39.196 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:39.196 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:39.196 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.196 1+0 records in 00:08:39.196 1+0 records out 00:08:39.196 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000737033 s, 5.6 MB/s 00:08:39.197 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.197 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:39.197 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.197 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:39.197 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:39.197 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:39.197 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:39.197 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:08:39.465 /dev/nbd7 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.465 1+0 records in 00:08:39.465 1+0 records out 00:08:39.465 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000599586 s, 6.8 MB/s 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:39.465 13:34:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:08:39.740 /dev/nbd8 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:39.740 13:34:28 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.740 1+0 records in 00:08:39.740 1+0 records out 00:08:39.740 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000691993 s, 5.9 MB/s 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:39.740 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:08:40.011 /dev/nbd9 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:40.011 1+0 records in 00:08:40.011 1+0 records out 00:08:40.011 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000887369 s, 4.6 MB/s 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
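The trace above repeats one readiness pattern for every exported node: nbd_start_disk is issued over /var/tmp/spdk-nbd.sock, then waitfornbd (common/autotest_common.sh) greps /proc/partitions for the new name, performs a single 4 KiB O_DIRECT dd read into a scratch file, checks the size with stat, and removes the file. Below is a minimal stand-alone sketch of that readiness check, not the SPDK helper itself; the function name, scratch path, and the sleep-based retry policy are assumptions for illustration.

#!/usr/bin/env bash
# Hypothetical sketch of the per-device readiness check seen in the trace.
# The real helper is waitfornbd in common/autotest_common.sh; the names and
# the bounded sleep/retry policy below are illustrative assumptions.
wait_for_nbd_ready() {
    local nbd_name=$1            # e.g. "nbd9"
    local tmp_file=$2            # scratch file for the 4 KiB probe read
    local i

    # Poll /proc/partitions until the kernel exposes the device (bounded retries).
    for (( i = 1; i <= 20; i++ )); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1
    done
    grep -q -w "$nbd_name" /proc/partitions || return 1

    # Probe the device with one direct-I/O block read, as the trace does with dd.
    dd if="/dev/$nbd_name" of="$tmp_file" bs=4096 count=1 iflag=direct || return 1
    [ "$(stat -c %s "$tmp_file")" -ne 0 ] || return 1
    rm -f "$tmp_file"
    return 0
}

# Example invocation (assumed paths): wait_for_nbd_ready nbd9 /tmp/nbdtest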
00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:40.011 13:34:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:40.673 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:40.673 { 00:08:40.673 "nbd_device": "/dev/nbd0", 00:08:40.673 "bdev_name": "Malloc0" 00:08:40.673 }, 00:08:40.673 { 00:08:40.673 "nbd_device": "/dev/nbd1", 00:08:40.673 "bdev_name": "Malloc1p0" 00:08:40.673 }, 00:08:40.673 { 00:08:40.673 "nbd_device": "/dev/nbd10", 00:08:40.673 "bdev_name": "Malloc1p1" 00:08:40.673 }, 00:08:40.673 { 00:08:40.673 "nbd_device": "/dev/nbd11", 00:08:40.673 "bdev_name": "Malloc2p0" 00:08:40.673 }, 00:08:40.673 { 00:08:40.673 "nbd_device": "/dev/nbd12", 00:08:40.673 "bdev_name": "Malloc2p1" 00:08:40.673 }, 00:08:40.673 { 00:08:40.673 "nbd_device": "/dev/nbd13", 00:08:40.673 "bdev_name": "Malloc2p2" 00:08:40.673 }, 00:08:40.673 { 00:08:40.674 "nbd_device": "/dev/nbd14", 00:08:40.674 "bdev_name": "Malloc2p3" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd15", 00:08:40.674 "bdev_name": "Malloc2p4" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd2", 00:08:40.674 "bdev_name": "Malloc2p5" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd3", 00:08:40.674 "bdev_name": "Malloc2p6" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd4", 00:08:40.674 "bdev_name": "Malloc2p7" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd5", 00:08:40.674 "bdev_name": "TestPT" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd6", 00:08:40.674 "bdev_name": "raid0" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd7", 00:08:40.674 "bdev_name": "concat0" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd8", 00:08:40.674 "bdev_name": "raid1" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd9", 00:08:40.674 "bdev_name": "AIO0" 00:08:40.674 } 00:08:40.674 ]' 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd0", 00:08:40.674 "bdev_name": "Malloc0" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd1", 00:08:40.674 "bdev_name": "Malloc1p0" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd10", 00:08:40.674 "bdev_name": "Malloc1p1" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd11", 00:08:40.674 "bdev_name": "Malloc2p0" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd12", 00:08:40.674 "bdev_name": "Malloc2p1" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd13", 00:08:40.674 "bdev_name": "Malloc2p2" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd14", 00:08:40.674 "bdev_name": "Malloc2p3" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd15", 
00:08:40.674 "bdev_name": "Malloc2p4" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd2", 00:08:40.674 "bdev_name": "Malloc2p5" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd3", 00:08:40.674 "bdev_name": "Malloc2p6" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd4", 00:08:40.674 "bdev_name": "Malloc2p7" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd5", 00:08:40.674 "bdev_name": "TestPT" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd6", 00:08:40.674 "bdev_name": "raid0" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd7", 00:08:40.674 "bdev_name": "concat0" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd8", 00:08:40.674 "bdev_name": "raid1" 00:08:40.674 }, 00:08:40.674 { 00:08:40.674 "nbd_device": "/dev/nbd9", 00:08:40.674 "bdev_name": "AIO0" 00:08:40.674 } 00:08:40.674 ]' 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:40.674 /dev/nbd1 00:08:40.674 /dev/nbd10 00:08:40.674 /dev/nbd11 00:08:40.674 /dev/nbd12 00:08:40.674 /dev/nbd13 00:08:40.674 /dev/nbd14 00:08:40.674 /dev/nbd15 00:08:40.674 /dev/nbd2 00:08:40.674 /dev/nbd3 00:08:40.674 /dev/nbd4 00:08:40.674 /dev/nbd5 00:08:40.674 /dev/nbd6 00:08:40.674 /dev/nbd7 00:08:40.674 /dev/nbd8 00:08:40.674 /dev/nbd9' 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:40.674 /dev/nbd1 00:08:40.674 /dev/nbd10 00:08:40.674 /dev/nbd11 00:08:40.674 /dev/nbd12 00:08:40.674 /dev/nbd13 00:08:40.674 /dev/nbd14 00:08:40.674 /dev/nbd15 00:08:40.674 /dev/nbd2 00:08:40.674 /dev/nbd3 00:08:40.674 /dev/nbd4 00:08:40.674 /dev/nbd5 00:08:40.674 /dev/nbd6 00:08:40.674 /dev/nbd7 00:08:40.674 /dev/nbd8 00:08:40.674 /dev/nbd9' 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:08:40.674 256+0 records in 00:08:40.674 256+0 records out 00:08:40.674 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109847 s, 95.5 MB/s 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:40.674 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:40.955 256+0 records in 00:08:40.955 256+0 records out 00:08:40.955 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.171154 s, 6.1 MB/s 00:08:40.955 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:40.955 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:40.955 256+0 records in 00:08:40.955 256+0 records out 00:08:40.955 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.141123 s, 7.4 MB/s 00:08:40.955 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:40.955 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:08:41.229 256+0 records in 00:08:41.229 256+0 records out 00:08:41.229 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.133227 s, 7.9 MB/s 00:08:41.229 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:41.229 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:08:41.229 256+0 records in 00:08:41.229 256+0 records out 00:08:41.229 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.176907 s, 5.9 MB/s 00:08:41.229 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:41.229 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:08:41.528 256+0 records in 00:08:41.528 256+0 records out 00:08:41.528 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.196423 s, 5.3 MB/s 00:08:41.528 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:41.528 13:34:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:08:41.797 256+0 records in 00:08:41.797 256+0 records out 00:08:41.797 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.186351 s, 5.6 MB/s 00:08:41.797 13:34:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:41.797 13:34:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:08:42.070 256+0 records in 00:08:42.070 256+0 records out 00:08:42.070 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.228085 s, 4.6 MB/s 00:08:42.070 13:34:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:42.070 13:34:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 
of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:08:42.070 256+0 records in 00:08:42.070 256+0 records out 00:08:42.071 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.213878 s, 4.9 MB/s 00:08:42.071 13:34:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:42.071 13:34:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:08:42.338 256+0 records in 00:08:42.338 256+0 records out 00:08:42.338 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.156807 s, 6.7 MB/s 00:08:42.338 13:34:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:42.338 13:34:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:08:42.602 256+0 records in 00:08:42.602 256+0 records out 00:08:42.602 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.203006 s, 5.2 MB/s 00:08:42.602 13:34:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:42.602 13:34:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:08:42.602 256+0 records in 00:08:42.602 256+0 records out 00:08:42.602 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.173924 s, 6.0 MB/s 00:08:42.602 13:34:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:42.602 13:34:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:08:42.859 256+0 records in 00:08:42.860 256+0 records out 00:08:42.860 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.13444 s, 7.8 MB/s 00:08:42.860 13:34:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:42.860 13:34:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:08:43.116 256+0 records in 00:08:43.116 256+0 records out 00:08:43.116 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.220324 s, 4.8 MB/s 00:08:43.116 13:34:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:43.116 13:34:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:08:43.116 256+0 records in 00:08:43.116 256+0 records out 00:08:43.116 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.153575 s, 6.8 MB/s 00:08:43.116 13:34:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:43.116 13:34:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:08:43.375 256+0 records in 00:08:43.375 256+0 records out 00:08:43.375 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.169404 s, 6.2 MB/s 00:08:43.375 13:34:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:43.375 13:34:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:08:43.634 
256+0 records in 00:08:43.634 256+0 records out 00:08:43.634 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.178079 s, 5.9 MB/s 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 
00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:43.634 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:43.893 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:43.893 13:34:32 
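The dd/cmp sequence above is the nbd_dd_data_verify pattern: a 1 MiB random reference file is written onto each exported /dev/nbdX with O_DIRECT, then every device is compared byte-for-byte against that same file. A minimal standalone sketch of the pattern, assuming the NBD devices are already connected to bdevs (the two-device list, temp path and pacing here are illustrative, not the test's own values):

#!/usr/bin/env bash
# Sketch of the write/verify pattern used by nbd_dd_data_verify (not the test script itself).
set -euo pipefail

nbd_list=(/dev/nbd0 /dev/nbd1)                     # illustrative subset; the test drives 16 devices
tmp_file=$(mktemp /tmp/nbdrandtest.XXXXXX)

# Build a 1 MiB random reference pattern (256 x 4 KiB blocks).
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256 status=none

# Write phase: copy the pattern to every device, bypassing the page cache.
for dev in "${nbd_list[@]}"; do
    dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct status=none
done

# Verify phase: compare the first 1 MiB of each device against the pattern;
# cmp exits non-zero (failing the script via set -e) on any mismatch.
for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$dev"
done

rm -f "$tmp_file"
echo "all devices verified"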
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:43.893 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:43.893 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:43.893 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:43.893 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:43.893 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:43.893 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:43.893 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:43.893 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:44.460 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:44.460 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:44.460 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:44.461 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:44.461 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:44.461 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:44.461 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:44.461 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:44.461 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:44.461 13:34:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:44.461 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:44.461 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:44.461 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:44.461 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:44.461 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:44.461 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:44.461 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:44.461 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:44.461 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:44.461 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:44.719 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:44.719 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:44.719 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:44.719 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:44.719 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:44.719 13:34:33 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:44.719 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:44.719 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:44.719 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:44.719 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:44.977 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:44.977 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:44.977 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:44.977 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:44.977 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:44.977 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:44.977 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:44.977 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:44.977 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:44.977 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:45.236 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:45.236 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:45.236 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:45.236 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:45.236 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:45.236 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:45.236 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:45.236 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:45.236 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:45.236 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:45.495 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:45.495 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:45.495 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:45.495 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:45.495 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:45.495 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:45.495 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:45.495 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:45.495 13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:45.495 
13:34:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:45.753 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:45.753 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:45.753 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:45.753 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:45.753 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:45.753 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:45.753 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:45.753 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:45.753 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:45.753 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:46.012 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:46.012 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:46.012 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:46.012 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.012 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.012 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:46.012 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.012 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.012 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.012 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:46.270 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:46.270 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:46.270 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:46.270 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.270 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.270 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:46.270 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.270 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.270 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.270 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:46.529 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:46.529 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:46.529 
13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:46.529 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.529 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.529 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:46.529 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.529 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.529 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.529 13:34:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:46.788 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:46.788 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:46.788 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:46.788 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:46.789 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:47.047 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:47.047 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:47.047 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:47.047 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:47.047 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:47.047 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:47.047 
13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:47.047 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:47.047 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:47.047 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:47.305 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:47.305 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:47.305 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:47.305 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:47.305 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:47.305 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:47.305 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:47.305 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:47.305 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:47.305 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:47.563 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:47.563 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:47.563 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:47.563 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:47.563 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:47.563 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:47.563 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:47.563 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:47.563 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:47.563 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:47.563 13:34:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- 
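Each nbd_stop_disk RPC above is followed by waitfornbd_exit, which polls /proc/partitions until the kernel has actually released the device, giving up after 20 attempts. A rough standalone equivalent of that stop-and-wait loop, assuming a running spdk-nbd target with its RPC socket at /var/tmp/spdk-nbd.sock and SPDK's scripts/rpc.py on PATH (the sleep interval is an assumption; the helper's real pacing is not visible in the log):

#!/usr/bin/env bash
# Sketch of stopping SPDK NBD exports and waiting for the kernel to drop each device.
set -euo pipefail

rpc_sock=/var/tmp/spdk-nbd.sock                    # socket path as used in the test
rpc=rpc.py                                         # e.g. $SPDK_DIR/scripts/rpc.py

waitfornbd_exit() {
    local nbd_name=$1 i
    for ((i = 1; i <= 20; i++)); do
        # The device disappears from /proc/partitions once teardown completes.
        grep -q -w "$nbd_name" /proc/partitions || return 0
        sleep 0.1                                  # assumed pacing; purely illustrative
    done
    echo "timed out waiting for $nbd_name to go away" >&2
    return 1
}

for dev in /dev/nbd0 /dev/nbd1; do                 # illustrative subset of the 16 devices
    "$rpc" -s "$rpc_sock" nbd_stop_disk "$dev"
    waitfornbd_exit "$(basename "$dev")"
done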
bdev/nbd_common.sh@104 -- # count=0 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:08:47.821 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:08:48.080 malloc_lvol_verify 00:08:48.080 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:08:48.338 b522d122-6778-463c-859e-79a45865cfbb 00:08:48.339 13:34:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:08:48.597 64924245-0fee-45b5-a952-9e0a5175f90d 00:08:48.597 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:08:48.856 /dev/nbd0 00:08:48.856 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:08:48.856 mke2fs 1.46.5 (30-Dec-2021) 00:08:48.856 Discarding device blocks: 0/4096 done 00:08:48.856 Creating filesystem with 4096 1k blocks and 1024 inodes 00:08:48.856 00:08:48.856 Allocating group tables: 0/1 done 00:08:48.856 Writing inode tables: 0/1 done 00:08:48.856 Creating journal (1024 blocks): done 00:08:48.856 Writing superblocks and filesystem accounting information: 0/1 done 00:08:48.856 00:08:48.856 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:08:48.856 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:08:48.856 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:48.856 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:08:48.856 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:48.856 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:48.856 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.856 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:49.116 13:34:37 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 407929 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 407929 ']' 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 407929 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 407929 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 407929' 00:08:49.116 killing process with pid 407929 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 407929 00:08:49.116 13:34:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 407929 00:08:49.685 13:34:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:08:49.685 00:08:49.685 real 0m25.231s 00:08:49.685 user 0m31.028s 00:08:49.685 sys 0m14.430s 00:08:49.685 13:34:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:49.685 13:34:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:49.685 ************************************ 00:08:49.685 END TEST bdev_nbd 00:08:49.685 ************************************ 00:08:49.685 13:34:38 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:49.685 13:34:38 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:08:49.685 13:34:38 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:08:49.685 13:34:38 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:08:49.685 13:34:38 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:08:49.685 13:34:38 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:49.685 13:34:38 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:49.685 13:34:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:49.685 ************************************ 00:08:49.685 START TEST bdev_fio 00:08:49.685 ************************************ 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- 
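The nbd_with_lvol_verify step above layers a logical volume on a malloc bdev, exports it over NBD and confirms that mkfs.ext4 succeeds on it before tearing everything down. A condensed sketch of that flow, assuming the same running target and RPC socket; the bdev names, sizes and RPC arguments are copied from the log, while error handling and the final wait are omitted:

#!/usr/bin/env bash
# Sketch of the lvol-over-NBD smoke test: malloc bdev -> lvstore -> lvol -> /dev/nbd0 -> mkfs.ext4.
set -euo pipefail

rpc() { rpc.py -s /var/tmp/spdk-nbd.sock "$@"; }      # assumes scripts/rpc.py on PATH

rpc bdev_malloc_create -b malloc_lvol_verify 16 512   # 16 MiB backing bdev, 512 B blocks
rpc bdev_lvol_create_lvstore malloc_lvol_verify lvs   # prints the new lvstore UUID
rpc bdev_lvol_create lvol 4 -l lvs                    # 4 MiB volume addressable as lvs/lvol
rpc nbd_start_disk lvs/lvol /dev/nbd0                 # expose the lvol as a kernel block device

mkfs.ext4 /dev/nbd0                                   # the check is simply that mkfs returns 0

rpc nbd_stop_disk /dev/nbd0                           # teardown; the test then waits for nbd0 to vanish

The point of this phase is an end-to-end sanity check that a filesystem can be created through the full bdev stack; the raw data path was already exercised by the dd/cmp phase above.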
common/autotest_common.sh@1123 -- # fio_test_suite '' 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:49.685 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.685 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 
-- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 
-- # echo filename=concat0 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:49.945 13:34:38 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:49.945 ************************************ 00:08:49.945 START TEST bdev_fio_rw_verify 00:08:49.945 ************************************ 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 
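The bdev_fio stage above builds a bdev.fio job file with fio_config_gen, appending one [job_<bdev>] section per bdev name, and then launches fio through SPDK's fio plugin, which is injected with LD_PRELOAD and selected with --ioengine=spdk_bdev. A reduced sketch of that invocation, assuming SPDK was built with the fio plugin (build/fio/spdk_bdev) and that bdev.json describes the bdevs; the [global] options in the hand-written job file below are illustrative, not the exact output of fio_config_gen:

#!/usr/bin/env bash
# Sketch of driving fio against SPDK bdevs via the spdk_bdev ioengine plugin.
set -euo pipefail

spdk_dir=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}
plugin=$spdk_dir/build/fio/spdk_bdev               # built when SPDK is configured --with-fio

# Abbreviated job file; the real test generates one [job_<name>] section per bdev.
cat > bdev.fio <<'EOF'
[global]
thread=1                 # the SPDK fio plugin runs in fio's thread mode
serialize_overlap=1      # emitted by the test for fio 3.x, as seen in the log
verify=crc32c            # illustrative verify workload

[job_Malloc0]
filename=Malloc0

[job_Malloc1p0]
filename=Malloc1p0
EOF

LD_PRELOAD=$plugin fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    --spdk_json_conf=bdev.json --verify_state_save=0 bdev.fio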
00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:49.945 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:49.946 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:08:49.946 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:49.946 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:49.946 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:49.946 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:49.946 13:34:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:50.205 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:08:50.205 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:50.205 fio-3.35 00:08:50.205 Starting 16 threads 00:09:02.408 00:09:02.408 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=411972: Fri Jul 12 13:34:50 2024 00:09:02.408 read: IOPS=84.8k, BW=331MiB/s (347MB/s)(3314MiB/10001msec) 00:09:02.408 slat (usec): min=2, max=404, avg=37.33, stdev=17.03 00:09:02.408 clat (usec): min=11, max=19748, avg=304.89, stdev=166.04 00:09:02.408 lat (usec): min=22, max=19792, avg=342.22, stdev=175.89 00:09:02.408 clat percentiles (usec): 00:09:02.409 | 50.000th=[ 285], 99.000th=[ 734], 99.900th=[ 865], 99.990th=[ 1037], 00:09:02.409 | 99.999th=[ 1926] 00:09:02.409 write: IOPS=131k, BW=513MiB/s (538MB/s)(5063MiB/9864msec); 0 zone resets 00:09:02.409 slat (usec): min=6, max=4351, avg=52.84, stdev=19.82 00:09:02.409 clat (usec): min=13, max=4813, avg=369.53, stdev=187.91 00:09:02.409 lat (usec): min=39, max=4849, avg=422.37, stdev=200.02 00:09:02.409 clat percentiles (usec): 00:09:02.409 | 50.000th=[ 343], 99.000th=[ 938], 99.900th=[ 1057], 99.990th=[ 1156], 00:09:02.409 | 99.999th=[ 1319] 00:09:02.409 bw ( KiB/s): min=399664, max=763274, per=99.35%, avg=522161.42, stdev=6906.12, samples=304 00:09:02.409 iops : min=99916, max=190818, avg=130540.26, stdev=1726.52, samples=304 00:09:02.409 lat (usec) : 20=0.01%, 50=0.45%, 100=4.21%, 250=30.06%, 500=46.80% 00:09:02.409 lat (usec) : 750=15.72%, 1000=2.54% 00:09:02.409 lat (msec) : 2=0.22%, 10=0.01%, 20=0.01% 00:09:02.409 cpu : usr=99.16%, sys=0.39%, ctx=702, majf=0, minf=1967 00:09:02.409 IO depths : 1=12.4%, 2=24.8%, 4=50.2%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:02.409 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:02.409 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:02.409 issued rwts: total=848305,1296065,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:02.409 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:02.409 00:09:02.409 Run status group 0 (all jobs): 00:09:02.409 READ: bw=331MiB/s (347MB/s), 331MiB/s-331MiB/s (347MB/s-347MB/s), io=3314MiB (3475MB), run=10001-10001msec 00:09:02.409 WRITE: bw=513MiB/s (538MB/s), 513MiB/s-513MiB/s (538MB/s-538MB/s), io=5063MiB (5309MB), run=9864-9864msec 00:09:02.409 00:09:02.409 real 0m12.049s 00:09:02.409 user 2m45.774s 00:09:02.409 sys 0m1.425s 00:09:02.409 13:34:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:02.409 13:34:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:02.409 ************************************ 00:09:02.409 END TEST bdev_fio_rw_verify 00:09:02.409 ************************************ 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:02.409 13:34:50 
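The fio summary above can be cross-checked from its raw counters: 848,305 reads and 1,296,065 writes were issued at bs=4k, over 10.001 s and 9.864 s respectively. A small sketch that recomputes the reported bandwidth from those totals (all numbers copied from this run):

#!/usr/bin/env bash
# Recompute fio's READ/WRITE bandwidth from the issued I/O counts printed above.
awk 'BEGIN {
    bs = 4096                                            # --bs=4k
    read_mb  = 848305  * bs / 1e6; read_bw  = read_mb  / 10.001
    write_mb = 1296065 * bs / 1e6; write_bw = write_mb / 9.864
    printf "READ:  %.0f MB total, %.0f MB/s\n", read_mb,  read_bw   # ~3475 MB, ~347 MB/s
    printf "WRITE: %.0f MB total, %.0f MB/s\n", write_mb, write_bw  # ~5309 MB, ~538 MB/s
}'

Both results match the 347 MB/s read and 538 MB/s write figures in the run status group, so the headline numbers are internally consistent with the issued-I/O counts.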
blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:09:02.409 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:02.410 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "b54295b6-dfcd-4f0c-bcec-95fc3f9789dd"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b54295b6-dfcd-4f0c-bcec-95fc3f9789dd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "16db39df-b484-59f3-892f-38fcbfdf91d4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "16db39df-b484-59f3-892f-38fcbfdf91d4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 
0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "e9bdf4d1-adc3-5abc-8d90-916baecd391a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e9bdf4d1-adc3-5abc-8d90-916baecd391a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "d5978583-f4ee-5a7a-82e0-668a13a80b62"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d5978583-f4ee-5a7a-82e0-668a13a80b62",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "602aac2f-9901-5a87-ae3e-d1e856e8cbe5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "602aac2f-9901-5a87-ae3e-d1e856e8cbe5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' 
' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "eca3a967-15ab-5017-84b3-9f4fd5d8a194"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "eca3a967-15ab-5017-84b3-9f4fd5d8a194",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "3fa35bdb-c241-52ad-9e8b-cf39e99be0e0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3fa35bdb-c241-52ad-9e8b-cf39e99be0e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "47519d7d-3387-58d9-87bc-00fe1479d37f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "47519d7d-3387-58d9-87bc-00fe1479d37f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "46978a68-410e-5a69-86f9-c63d78064cd3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "46978a68-410e-5a69-86f9-c63d78064cd3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "fa65e3d7-a815-5b29-91be-a747ab168765"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fa65e3d7-a815-5b29-91be-a747ab168765",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "2c4fbf9a-94b6-5921-bdf5-d3fb7fa20e98"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2c4fbf9a-94b6-5921-bdf5-d3fb7fa20e98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "8a872f31-e213-59da-ad5c-4bbc641cf23f"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8a872f31-e213-59da-ad5c-4bbc641cf23f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "887a9296-4a15-4ae1-bfe4-a25bfd1ba94d"' ' ],' ' 
"product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "887a9296-4a15-4ae1-bfe4-a25bfd1ba94d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "887a9296-4a15-4ae1-bfe4-a25bfd1ba94d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "82f065d8-cba3-44be-95f5-36b3d63dbfc4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "a3677b05-119b-4a6d-a3b6-08fe1271fab9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "f6a2fd49-906b-41f2-8a76-b7f9d08dfd67"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f6a2fd49-906b-41f2-8a76-b7f9d08dfd67",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f6a2fd49-906b-41f2-8a76-b7f9d08dfd67",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "0b590ef9-e259-4325-a063-cc78c88dbec2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "134cde33-6b1b-49c6-99c6-d49831308134",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' 
"name": "raid1",' ' "aliases": [' ' "0a286f5e-98f2-429f-b91c-d126e3ce2d3c"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0a286f5e-98f2-429f-b91c-d126e3ce2d3c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "0a286f5e-98f2-429f-b91c-d126e3ce2d3c",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "e03d49df-bd3f-4144-a6bd-091b43bd4889",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "23778c45-1042-463e-be30-84bdde1bd4df",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "18ad1597-9b1d-4116-aa64-74f795158054"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "18ad1597-9b1d-4116-aa64-74f795158054",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:02.410 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:09:02.410 Malloc1p0 00:09:02.410 Malloc1p1 00:09:02.410 Malloc2p0 00:09:02.410 Malloc2p1 00:09:02.410 Malloc2p2 00:09:02.410 Malloc2p3 00:09:02.410 Malloc2p4 00:09:02.410 Malloc2p5 00:09:02.410 Malloc2p6 00:09:02.410 Malloc2p7 00:09:02.410 TestPT 00:09:02.410 raid0 00:09:02.410 concat0 ]] 00:09:02.410 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' 
"b54295b6-dfcd-4f0c-bcec-95fc3f9789dd"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b54295b6-dfcd-4f0c-bcec-95fc3f9789dd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "16db39df-b484-59f3-892f-38fcbfdf91d4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "16db39df-b484-59f3-892f-38fcbfdf91d4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "e9bdf4d1-adc3-5abc-8d90-916baecd391a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e9bdf4d1-adc3-5abc-8d90-916baecd391a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "d5978583-f4ee-5a7a-82e0-668a13a80b62"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d5978583-f4ee-5a7a-82e0-668a13a80b62",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": 
false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "602aac2f-9901-5a87-ae3e-d1e856e8cbe5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "602aac2f-9901-5a87-ae3e-d1e856e8cbe5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "eca3a967-15ab-5017-84b3-9f4fd5d8a194"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "eca3a967-15ab-5017-84b3-9f4fd5d8a194",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "3fa35bdb-c241-52ad-9e8b-cf39e99be0e0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3fa35bdb-c241-52ad-9e8b-cf39e99be0e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "47519d7d-3387-58d9-87bc-00fe1479d37f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "47519d7d-3387-58d9-87bc-00fe1479d37f",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "46978a68-410e-5a69-86f9-c63d78064cd3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "46978a68-410e-5a69-86f9-c63d78064cd3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "fa65e3d7-a815-5b29-91be-a747ab168765"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fa65e3d7-a815-5b29-91be-a747ab168765",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "2c4fbf9a-94b6-5921-bdf5-d3fb7fa20e98"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2c4fbf9a-94b6-5921-bdf5-d3fb7fa20e98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' 
"nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "8a872f31-e213-59da-ad5c-4bbc641cf23f"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8a872f31-e213-59da-ad5c-4bbc641cf23f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "887a9296-4a15-4ae1-bfe4-a25bfd1ba94d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "887a9296-4a15-4ae1-bfe4-a25bfd1ba94d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "887a9296-4a15-4ae1-bfe4-a25bfd1ba94d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "82f065d8-cba3-44be-95f5-36b3d63dbfc4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "a3677b05-119b-4a6d-a3b6-08fe1271fab9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "f6a2fd49-906b-41f2-8a76-b7f9d08dfd67"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f6a2fd49-906b-41f2-8a76-b7f9d08dfd67",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": 
true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f6a2fd49-906b-41f2-8a76-b7f9d08dfd67",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "0b590ef9-e259-4325-a063-cc78c88dbec2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "134cde33-6b1b-49c6-99c6-d49831308134",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "0a286f5e-98f2-429f-b91c-d126e3ce2d3c"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0a286f5e-98f2-429f-b91c-d126e3ce2d3c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "0a286f5e-98f2-429f-b91c-d126e3ce2d3c",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "e03d49df-bd3f-4144-a6bd-091b43bd4889",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "23778c45-1042-463e-be30-84bdde1bd4df",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "18ad1597-9b1d-4116-aa64-74f795158054"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "18ad1597-9b1d-4116-aa64-74f795158054",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": 
false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:09:02.411 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.412 13:34:50 
blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:02.412 13:34:50 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:02.412 ************************************ 00:09:02.412 START TEST bdev_fio_trim 00:09:02.412 ************************************ 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:02.412 13:34:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:02.412 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:02.412 fio-3.35 00:09:02.412 Starting 14 threads 00:09:14.639 00:09:14.639 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=413752: Fri Jul 12 13:35:01 2024 00:09:14.639 write: IOPS=123k, BW=480MiB/s (503MB/s)(4801MiB/10002msec); 0 zone resets 00:09:14.639 slat (usec): min=2, max=3288, avg=40.35, stdev=10.98 00:09:14.639 clat (usec): min=27, max=1699, avg=285.26, stdev=93.52 00:09:14.639 lat (usec): min=43, max=3701, avg=325.61, stdev=96.84 00:09:14.639 clat percentiles (usec): 00:09:14.639 | 50.000th=[ 277], 99.000th=[ 486], 99.900th=[ 537], 99.990th=[ 594], 00:09:14.639 | 99.999th=[ 750] 00:09:14.639 bw ( KiB/s): min=446264, max=629721, per=100.00%, avg=492706.37, stdev=3007.91, samples=266 00:09:14.639 iops : min=111566, max=157430, avg=123176.63, stdev=751.97, samples=266 00:09:14.639 trim: IOPS=123k, BW=480MiB/s (503MB/s)(4801MiB/10002msec); 0 zone resets 00:09:14.639 slat (usec): min=4, max=1142, avg=27.06, stdev= 6.82 00:09:14.639 clat (usec): min=8, max=3701, avg=324.14, stdev=98.55 00:09:14.639 lat (usec): min=22, max=3719, avg=351.20, stdev=100.99 00:09:14.639 clat percentiles (usec): 00:09:14.639 | 50.000th=[ 318], 99.000th=[ 537], 99.900th=[ 586], 99.990th=[ 652], 00:09:14.639 | 99.999th=[ 799] 00:09:14.639 bw ( KiB/s): min=446264, max=629729, per=100.00%, avg=492707.21, stdev=3007.91, samples=266 00:09:14.639 iops : min=111566, max=157432, avg=123176.74, stdev=751.97, samples=266 00:09:14.639 lat (usec) : 10=0.01%, 20=0.01%, 50=0.03%, 100=0.61%, 250=32.14% 00:09:14.639 lat (usec) : 500=65.25%, 750=1.97%, 1000=0.01% 00:09:14.639 lat (msec) : 2=0.01%, 4=0.01% 
00:09:14.639 cpu : usr=99.61%, sys=0.01%, ctx=617, majf=0, minf=1033 00:09:14.639 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:14.639 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:14.639 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:14.639 issued rwts: total=0,1229133,1229136,0 short=0,0,0,0 dropped=0,0,0,0 00:09:14.639 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:14.639 00:09:14.639 Run status group 0 (all jobs): 00:09:14.639 WRITE: bw=480MiB/s (503MB/s), 480MiB/s-480MiB/s (503MB/s-503MB/s), io=4801MiB (5035MB), run=10002-10002msec 00:09:14.639 TRIM: bw=480MiB/s (503MB/s), 480MiB/s-480MiB/s (503MB/s-503MB/s), io=4801MiB (5035MB), run=10002-10002msec 00:09:14.639 00:09:14.639 real 0m11.666s 00:09:14.639 user 2m26.171s 00:09:14.639 sys 0m0.744s 00:09:14.639 13:35:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:14.639 13:35:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:14.639 ************************************ 00:09:14.639 END TEST bdev_fio_trim 00:09:14.639 ************************************ 00:09:14.639 13:35:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:14.639 13:35:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:14.639 13:35:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:14.639 13:35:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:14.639 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:14.639 13:35:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:14.639 00:09:14.639 real 0m24.097s 00:09:14.639 user 5m12.166s 00:09:14.639 sys 0m2.361s 00:09:14.639 13:35:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:14.639 13:35:02 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:14.639 ************************************ 00:09:14.639 END TEST bdev_fio 00:09:14.639 ************************************ 00:09:14.639 13:35:02 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:14.639 13:35:02 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:14.639 13:35:02 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:14.639 13:35:02 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:14.639 13:35:02 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:14.639 13:35:02 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:14.639 ************************************ 00:09:14.639 START TEST bdev_verify 00:09:14.639 ************************************ 00:09:14.639 13:35:02 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:14.639 [2024-07-12 13:35:02.442172] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:09:14.639 [2024-07-12 13:35:02.442239] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid415211 ] 00:09:14.639 [2024-07-12 13:35:02.573258] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:14.639 [2024-07-12 13:35:02.677806] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:14.639 [2024-07-12 13:35:02.677811] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:14.639 [2024-07-12 13:35:02.833585] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:14.639 [2024-07-12 13:35:02.833644] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:14.639 [2024-07-12 13:35:02.833658] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:14.639 [2024-07-12 13:35:02.841595] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:14.639 [2024-07-12 13:35:02.841622] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:14.639 [2024-07-12 13:35:02.849603] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:14.639 [2024-07-12 13:35:02.849627] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:14.639 [2024-07-12 13:35:02.926812] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:14.639 [2024-07-12 13:35:02.926859] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:14.639 [2024-07-12 13:35:02.926876] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a9bf00 00:09:14.639 [2024-07-12 13:35:02.926888] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:14.639 [2024-07-12 13:35:02.928402] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:14.639 [2024-07-12 13:35:02.928429] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:14.639 Running I/O for 5 seconds... 
00:09:21.197 00:09:21.197 Latency(us) 00:09:21.197 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:21.197 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.197 Verification LBA range: start 0x0 length 0x1000 00:09:21.197 Malloc0 : 5.09 1206.84 4.71 0.00 0.00 105868.36 548.51 339191.54 00:09:21.197 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x1000 length 0x1000 00:09:21.198 Malloc0 : 5.09 956.50 3.74 0.00 0.00 133540.69 747.97 406665.13 00:09:21.198 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x800 00:09:21.198 Malloc1p0 : 5.09 628.30 2.45 0.00 0.00 202873.36 2478.97 172331.19 00:09:21.198 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x800 length 0x800 00:09:21.198 Malloc1p0 : 5.09 503.13 1.97 0.00 0.00 253133.15 3134.33 219745.06 00:09:21.198 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x800 00:09:21.198 Malloc1p1 : 5.10 628.02 2.45 0.00 0.00 202529.35 2478.97 171419.38 00:09:21.198 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x800 length 0x800 00:09:21.198 Malloc1p1 : 5.28 509.15 1.99 0.00 0.00 249589.33 3348.03 219745.06 00:09:21.198 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x200 00:09:21.198 Malloc2p0 : 5.10 627.75 2.45 0.00 0.00 202194.23 2721.17 171419.38 00:09:21.198 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x200 length 0x200 00:09:21.198 Malloc2p0 : 5.28 508.91 1.99 0.00 0.00 249047.63 4160.11 217921.45 00:09:21.198 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x200 00:09:21.198 Malloc2p1 : 5.10 627.47 2.45 0.00 0.00 201835.14 3632.97 165948.55 00:09:21.198 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x200 length 0x200 00:09:21.198 Malloc2p1 : 5.28 508.67 1.99 0.00 0.00 248372.82 3704.21 215186.03 00:09:21.198 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x200 00:09:21.198 Malloc2p2 : 5.24 635.07 2.48 0.00 0.00 198947.26 3120.08 163213.13 00:09:21.198 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x200 length 0x200 00:09:21.198 Malloc2p2 : 5.29 508.42 1.99 0.00 0.00 247779.23 3162.82 213362.42 00:09:21.198 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x200 00:09:21.198 Malloc2p3 : 5.24 634.78 2.48 0.00 0.00 198578.44 2464.72 161389.52 00:09:21.198 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x200 length 0x200 00:09:21.198 Malloc2p3 : 5.29 508.17 1.99 0.00 0.00 247285.15 3405.02 213362.42 00:09:21.198 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x200 00:09:21.198 Malloc2p4 : 5.25 634.49 2.48 0.00 0.00 198258.20 2464.72 
163213.13 00:09:21.198 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x200 length 0x200 00:09:21.198 Malloc2p4 : 5.29 507.93 1.98 0.00 0.00 246745.61 4274.09 210627.01 00:09:21.198 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x200 00:09:21.198 Malloc2p5 : 5.25 634.20 2.48 0.00 0.00 197933.07 2835.14 163213.13 00:09:21.198 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x200 length 0x200 00:09:21.198 Malloc2p5 : 5.29 507.68 1.98 0.00 0.00 246034.26 3590.23 206979.78 00:09:21.198 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x200 00:09:21.198 Malloc2p6 : 5.25 633.91 2.48 0.00 0.00 197596.85 3675.71 160477.72 00:09:21.198 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x200 length 0x200 00:09:21.198 Malloc2p6 : 5.30 507.43 1.98 0.00 0.00 245505.67 3077.34 206979.78 00:09:21.198 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x200 00:09:21.198 Malloc2p7 : 5.25 633.63 2.48 0.00 0.00 197154.78 3048.85 157742.30 00:09:21.198 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x200 length 0x200 00:09:21.198 Malloc2p7 : 5.30 507.19 1.98 0.00 0.00 244987.24 3063.10 200597.15 00:09:21.198 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x1000 00:09:21.198 TestPT : 5.25 614.05 2.40 0.00 0.00 202170.87 10428.77 157742.30 00:09:21.198 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x1000 length 0x1000 00:09:21.198 TestPT : 5.27 486.11 1.90 0.00 0.00 254874.31 63826.37 202420.76 00:09:21.198 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x2000 00:09:21.198 raid0 : 5.26 633.19 2.47 0.00 0.00 196341.08 2578.70 150447.86 00:09:21.198 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x2000 length 0x2000 00:09:21.198 raid0 : 5.30 506.94 1.98 0.00 0.00 243825.18 3604.48 185096.46 00:09:21.198 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x2000 00:09:21.198 concat0 : 5.26 632.91 2.47 0.00 0.00 196025.68 2350.75 152271.47 00:09:21.198 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x2000 length 0x2000 00:09:21.198 concat0 : 5.30 506.69 1.98 0.00 0.00 243152.64 3433.52 185096.46 00:09:21.198 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 length 0x1000 00:09:21.198 raid1 : 5.26 632.62 2.47 0.00 0.00 195705.00 3020.35 158654.11 00:09:21.198 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x1000 length 0x1000 00:09:21.198 raid1 : 5.31 506.44 1.98 0.00 0.00 242507.51 4131.62 190567.29 00:09:21.198 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x0 
length 0x4e2 00:09:21.198 AIO0 : 5.26 632.41 2.47 0.00 0.00 195323.72 1638.40 166860.35 00:09:21.198 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:21.198 Verification LBA range: start 0x4e2 length 0x4e2 00:09:21.198 AIO0 : 5.31 506.25 1.98 0.00 0.00 241812.44 1716.76 194214.51 00:09:21.198 =================================================================================================================== 00:09:21.198 Total : 19215.27 75.06 0.00 0.00 209187.54 548.51 406665.13 00:09:21.198 00:09:21.198 real 0m6.505s 00:09:21.198 user 0m12.050s 00:09:21.198 sys 0m0.420s 00:09:21.198 13:35:08 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:21.198 13:35:08 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:09:21.198 ************************************ 00:09:21.198 END TEST bdev_verify 00:09:21.198 ************************************ 00:09:21.198 13:35:08 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:21.198 13:35:08 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:21.198 13:35:08 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:21.198 13:35:08 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:21.198 13:35:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:21.198 ************************************ 00:09:21.198 START TEST bdev_verify_big_io 00:09:21.198 ************************************ 00:09:21.198 13:35:08 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:21.198 [2024-07-12 13:35:09.028392] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
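The run_test line above shows the exact bdevperf invocation the harness uses for the big-I/O verify pass: queue depth 128, 64 KiB I/Os, a 5 second verify workload on core mask 0x3, driven by the test/bdev/bdev.json config referenced on the command line. A minimal way to repeat just that pass outside Jenkins is sketched below; SPDK_DIR is an assumed local checkout path, the flags themselves are copied from the trace (the harness also appends an empty '' positional argument, omitted here).

# Re-run sketch for the big-I/O verify pass; SPDK_DIR is assumed, all flags
# are taken from the run_test invocation visible above.
SPDK_DIR=${SPDK_DIR:-$HOME/spdk}
"$SPDK_DIR/build/examples/bdevperf" \
    --json "$SPDK_DIR/test/bdev/bdev.json" \
    -q 128 -o 65536 -w verify -t 5 -C -m 0x3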
00:09:21.198 [2024-07-12 13:35:09.028460] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid416019 ] 00:09:21.198 [2024-07-12 13:35:09.158136] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:21.198 [2024-07-12 13:35:09.260680] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:21.198 [2024-07-12 13:35:09.260686] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:21.198 [2024-07-12 13:35:09.418600] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:21.198 [2024-07-12 13:35:09.418661] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:21.198 [2024-07-12 13:35:09.418676] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:21.198 [2024-07-12 13:35:09.426610] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:21.198 [2024-07-12 13:35:09.426637] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:21.198 [2024-07-12 13:35:09.434621] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:21.198 [2024-07-12 13:35:09.434644] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:21.198 [2024-07-12 13:35:09.511993] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:21.198 [2024-07-12 13:35:09.512045] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:21.198 [2024-07-12 13:35:09.512062] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x250bf00 00:09:21.198 [2024-07-12 13:35:09.512075] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:21.198 [2024-07-12 13:35:09.513520] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:21.198 [2024-07-12 13:35:09.513549] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:21.198 [2024-07-12 13:35:09.673109] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:21.198 [2024-07-12 13:35:09.674307] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:21.198 [2024-07-12 13:35:09.676060] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.677211] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.678991] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.680095] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.681564] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.683084] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.684014] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.685534] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.686478] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.687962] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.688882] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.690393] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.691310] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.692778] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32 00:09:21.199 [2024-07-12 13:35:09.717881] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:21.199 [2024-07-12 13:35:09.719935] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:21.199 Running I/O for 5 seconds... 00:09:29.313 00:09:29.313 Latency(us) 00:09:29.313 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:29.313 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x100 00:09:29.313 Malloc0 : 6.06 147.88 9.24 0.00 0.00 848178.98 858.38 2450932.42 00:09:29.313 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x100 length 0x100 00:09:29.313 Malloc0 : 7.29 193.09 12.07 0.00 0.00 451351.20 1104.14 798741.37 00:09:29.313 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x80 00:09:29.313 Malloc1p0 : 6.80 35.30 2.21 0.00 0.00 3295318.33 1488.81 5631309.02 00:09:29.313 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x80 length 0x80 00:09:29.313 Malloc1p0 : 7.03 60.86 3.80 0.00 0.00 1923463.57 2963.37 3763931.94 00:09:29.313 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x80 00:09:29.313 Malloc1p1 : 6.80 35.29 2.21 0.00 0.00 3182813.47 1567.17 5427064.65 00:09:29.313 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x80 length 0x80 00:09:29.313 Malloc1p1 : 7.31 28.46 1.78 0.00 0.00 4093279.11 4673.00 6535819.80 00:09:29.313 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x20 00:09:29.313 Malloc2p0 : 6.33 22.74 1.42 0.00 0.00 1226600.19 655.36 2071621.45 00:09:29.313 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x20 length 0x20 00:09:29.313 Malloc2p0 : 6.73 16.64 1.04 0.00 0.00 1709614.80 772.90 2815654.51 00:09:29.313 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x20 00:09:29.313 Malloc2p1 : 6.33 22.74 1.42 0.00 0.00 1214525.48 637.55 2042443.69 00:09:29.313 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x20 length 0x20 00:09:29.313 Malloc2p1 : 6.73 16.64 1.04 0.00 0.00 1690859.89 815.64 2771887.86 00:09:29.313 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x20 00:09:29.313 Malloc2p2 : 6.33 22.73 1.42 0.00 0.00 1202375.79 637.55 2013265.92 00:09:29.313 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x20 length 0x20 00:09:29.313 Malloc2p2 : 6.73 16.63 1.04 0.00 0.00 1674016.71 769.34 2742710.09 00:09:29.313 Job: Malloc2p3 (Core Mask 0x1, workload: 
verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x20 00:09:29.313 Malloc2p3 : 6.42 24.93 1.56 0.00 0.00 1107105.74 651.80 1984088.15 00:09:29.313 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x20 length 0x20 00:09:29.313 Malloc2p3 : 6.74 16.63 1.04 0.00 0.00 1655804.99 780.02 2713532.33 00:09:29.313 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x20 00:09:29.313 Malloc2p4 : 6.42 24.93 1.56 0.00 0.00 1095619.91 633.99 1954910.39 00:09:29.313 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x20 length 0x20 00:09:29.313 Malloc2p4 : 6.74 16.62 1.04 0.00 0.00 1637923.35 772.90 2684354.56 00:09:29.313 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x20 00:09:29.313 Malloc2p5 : 6.42 24.92 1.56 0.00 0.00 1085663.38 644.67 1925732.62 00:09:29.313 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x20 length 0x20 00:09:29.313 Malloc2p5 : 6.74 16.62 1.04 0.00 0.00 1620138.31 772.90 2640587.91 00:09:29.313 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x20 00:09:29.313 Malloc2p6 : 6.42 24.92 1.56 0.00 0.00 1074837.67 644.67 1896554.85 00:09:29.313 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x20 length 0x20 00:09:29.313 Malloc2p6 : 6.74 16.61 1.04 0.00 0.00 1602387.30 794.27 2611410.14 00:09:29.313 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x20 00:09:29.313 Malloc2p7 : 6.42 24.91 1.56 0.00 0.00 1064526.50 626.87 1860082.64 00:09:29.313 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x20 length 0x20 00:09:29.313 Malloc2p7 : 7.03 18.20 1.14 0.00 0.00 1471438.54 783.58 2582232.38 00:09:29.313 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x100 00:09:29.313 TestPT : 6.96 36.77 2.30 0.00 0.00 2719096.06 1488.81 4931042.62 00:09:29.313 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x100 length 0x100 00:09:29.313 TestPT : 7.29 28.82 1.80 0.00 0.00 3618594.53 253481.85 4318309.51 00:09:29.313 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x200 00:09:29.313 raid0 : 7.02 38.73 2.42 0.00 0.00 2496234.41 1602.78 4755976.01 00:09:29.313 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x200 length 0x200 00:09:29.313 raid0 : 7.29 30.74 1.92 0.00 0.00 3254080.97 1994.57 5514597.95 00:09:29.313 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x200 00:09:29.313 concat0 : 7.02 45.56 2.85 0.00 0.00 2084772.36 1538.67 4551731.65 00:09:29.313 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x200 length 0x200 00:09:29.313 concat0 : 7.22 33.26 2.08 0.00 
0.00 2924474.81 1966.08 5281175.82 00:09:29.313 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x100 00:09:29.313 raid1 : 7.03 59.21 3.70 0.00 0.00 1583026.14 2037.31 4376665.04 00:09:29.313 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x100 length 0x100 00:09:29.313 raid1 : 7.22 37.68 2.35 0.00 0.00 2482765.55 2521.71 5018575.92 00:09:29.313 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x0 length 0x4e 00:09:29.313 AIO0 : 7.03 56.34 3.52 0.00 0.00 986277.94 477.27 2801065.63 00:09:29.313 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:09:29.313 Verification LBA range: start 0x4e length 0x4e 00:09:29.313 AIO0 : 7.22 32.68 2.04 0.00 0.00 1674504.49 997.29 3938998.54 00:09:29.313 =================================================================================================================== 00:09:29.313 Total : 1228.10 76.76 0.00 0.00 1648461.09 477.27 6535819.80 00:09:29.313 00:09:29.313 real 0m8.568s 00:09:29.313 user 0m16.132s 00:09:29.313 sys 0m0.443s 00:09:29.313 13:35:17 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:29.313 13:35:17 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:09:29.313 ************************************ 00:09:29.313 END TEST bdev_verify_big_io 00:09:29.313 ************************************ 00:09:29.313 13:35:17 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:29.313 13:35:17 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:29.313 13:35:17 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:29.313 13:35:17 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:29.313 13:35:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:29.313 ************************************ 00:09:29.313 START TEST bdev_write_zeroes 00:09:29.313 ************************************ 00:09:29.313 13:35:17 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:29.313 [2024-07-12 13:35:17.688848] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
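The START TEST / END TEST banners and the real/user/sys block above come from the run_test helper (defined in autotest_common.sh, as the line references in the trace suggest), which times each wrapped command and propagates its exit status. The real helper also manages xtrace and per-test accounting; the function below is only a simplified illustration of the banner-plus-timing pattern, not SPDK's implementation.

# Simplified illustration of the run_test banner/timing pattern seen in this
# log; this is NOT the real autotest_common.sh helper.
run_test_sketch() {
    local name=$1
    shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"
    local rc=$?
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
    return $rc
}
# Usage (illustrative):
# run_test_sketch bdev_write_zeroes ./build/examples/bdevperf \
#     --json test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1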
00:09:29.313 [2024-07-12 13:35:17.688915] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid417255 ] 00:09:29.313 [2024-07-12 13:35:17.820719] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:29.572 [2024-07-12 13:35:17.928184] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:29.572 [2024-07-12 13:35:18.088260] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:29.572 [2024-07-12 13:35:18.088326] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:29.572 [2024-07-12 13:35:18.088341] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:29.572 [2024-07-12 13:35:18.096262] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:29.572 [2024-07-12 13:35:18.096289] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:29.572 [2024-07-12 13:35:18.104272] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:29.572 [2024-07-12 13:35:18.104297] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:29.830 [2024-07-12 13:35:18.181606] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:29.830 [2024-07-12 13:35:18.181663] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:29.830 [2024-07-12 13:35:18.181681] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21739c0 00:09:29.830 [2024-07-12 13:35:18.181694] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:29.830 [2024-07-12 13:35:18.183147] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:29.830 [2024-07-12 13:35:18.183177] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:29.830 Running I/O for 1 seconds... 
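The vbdev_passthru notices above show TestPT being layered on top of Malloc3 by the JSON config before the write_zeroes run starts. Roughly the same stack can be built by hand against a running SPDK application with rpc.py; the sketch below assumes the default /var/tmp/spdk.sock socket, and the bdev_passthru_create flag spellings should be double-checked against the rpc.py in your tree.

# Hand-built equivalent of the Malloc3 -> TestPT passthru stack, over RPC.
# Assumes an SPDK app is already listening on the default RPC socket.
SPDK_DIR=${SPDK_DIR:-$HOME/spdk}
"$SPDK_DIR/scripts/rpc.py" bdev_malloc_create -b Malloc3 64 512      # 64 MiB, 512 B blocks; sizes illustrative
"$SPDK_DIR/scripts/rpc.py" bdev_passthru_create -b Malloc3 -p TestPT
"$SPDK_DIR/scripts/rpc.py" bdev_get_bdevs -b TestPT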
00:09:31.206 00:09:31.206 Latency(us) 00:09:31.206 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:31.206 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 Malloc0 : 1.05 4993.86 19.51 0.00 0.00 25606.46 648.24 42626.89 00:09:31.206 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 Malloc1p0 : 1.05 4986.80 19.48 0.00 0.00 25600.11 904.68 41715.09 00:09:31.206 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 Malloc1p1 : 1.05 4979.76 19.45 0.00 0.00 25579.73 897.56 40803.28 00:09:31.206 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 Malloc2p0 : 1.06 4972.71 19.42 0.00 0.00 25561.27 894.00 39891.48 00:09:31.206 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 Malloc2p1 : 1.06 4965.74 19.40 0.00 0.00 25548.53 894.00 39207.62 00:09:31.206 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 Malloc2p2 : 1.06 4958.81 19.37 0.00 0.00 25527.16 894.00 38295.82 00:09:31.206 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 Malloc2p3 : 1.06 4951.80 19.34 0.00 0.00 25507.54 894.00 37384.01 00:09:31.206 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 Malloc2p4 : 1.06 4944.72 19.32 0.00 0.00 25488.44 890.43 36472.21 00:09:31.206 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 Malloc2p5 : 1.06 4937.80 19.29 0.00 0.00 25469.90 894.00 35560.40 00:09:31.206 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 Malloc2p6 : 1.06 4930.87 19.26 0.00 0.00 25450.12 890.43 34648.60 00:09:31.206 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 Malloc2p7 : 1.07 4923.94 19.23 0.00 0.00 25432.44 890.43 33736.79 00:09:31.206 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 TestPT : 1.07 4917.11 19.21 0.00 0.00 25412.78 926.05 32824.99 00:09:31.206 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 raid0 : 1.07 4909.21 19.18 0.00 0.00 25377.83 1624.15 31229.33 00:09:31.206 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 concat0 : 1.07 4901.48 19.15 0.00 0.00 25322.25 1602.78 29633.67 00:09:31.206 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 raid1 : 1.07 4891.84 19.11 0.00 0.00 25262.85 2564.45 27126.21 00:09:31.206 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:31.206 AIO0 : 1.07 4885.96 19.09 0.00 0.00 25173.87 1032.90 26100.42 00:09:31.206 =================================================================================================================== 00:09:31.206 Total : 79052.41 308.80 0.00 0.00 25457.58 648.24 42626.89 00:09:31.465 00:09:31.465 real 0m2.255s 00:09:31.465 user 0m1.836s 00:09:31.465 sys 0m0.358s 00:09:31.465 13:35:19 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:31.465 13:35:19 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:09:31.465 ************************************ 00:09:31.465 END TEST bdev_write_zeroes 00:09:31.465 ************************************ 00:09:31.465 13:35:19 blockdev_general 
-- common/autotest_common.sh@1142 -- # return 0 00:09:31.465 13:35:19 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:31.465 13:35:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:31.465 13:35:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:31.465 13:35:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:31.465 ************************************ 00:09:31.465 START TEST bdev_json_nonenclosed 00:09:31.465 ************************************ 00:09:31.465 13:35:19 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:31.465 [2024-07-12 13:35:20.028296] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:09:31.465 [2024-07-12 13:35:20.028366] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid417508 ] 00:09:31.723 [2024-07-12 13:35:20.162552] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:31.724 [2024-07-12 13:35:20.265619] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:31.724 [2024-07-12 13:35:20.265696] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
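The "not enclosed in {}" error above is the expected outcome of this negative test: a bdevperf --json config must be a single JSON object whose "subsystems" member is an array of subsystem blocks. For contrast, the sketch below writes a minimal well-formed config with one malloc bdev; the bdev name and sizes are illustrative and are not the fixture used by the test.

# Minimal well-formed bdevperf --json config, for contrast with the broken
# nonenclosed.json / nonarray.json fixtures. Run from the SPDK source root.
cat > /tmp/minimal_bdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 131072, "block_size": 512 }
        }
      ]
    }
  ]
}
EOF
./build/examples/bdevperf --json /tmp/minimal_bdev.json -q 128 -o 4096 -w write_zeroes -t 1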
00:09:31.724 [2024-07-12 13:35:20.265718] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:31.724 [2024-07-12 13:35:20.265732] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:31.982 00:09:31.982 real 0m0.408s 00:09:31.982 user 0m0.237s 00:09:31.982 sys 0m0.168s 00:09:31.982 13:35:20 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:09:31.982 13:35:20 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:31.982 13:35:20 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:09:31.982 ************************************ 00:09:31.982 END TEST bdev_json_nonenclosed 00:09:31.982 ************************************ 00:09:31.982 13:35:20 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:09:31.982 13:35:20 blockdev_general -- bdev/blockdev.sh@782 -- # true 00:09:31.982 13:35:20 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:31.982 13:35:20 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:31.982 13:35:20 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:31.982 13:35:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:31.982 ************************************ 00:09:31.982 START TEST bdev_json_nonarray 00:09:31.982 ************************************ 00:09:31.982 13:35:20 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:31.982 [2024-07-12 13:35:20.522518] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:09:31.982 [2024-07-12 13:35:20.522581] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid417641 ] 00:09:32.240 [2024-07-12 13:35:20.649999] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:32.240 [2024-07-12 13:35:20.750759] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:32.240 [2024-07-12 13:35:20.750837] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
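Both JSON negative tests, the nonenclosed case above and the nonarray case running here, pass when bdevperf refuses the config and exits non-zero; the es=234 captured by the harness is that exit status being recorded, and the '# true' step from blockdev.sh immediately afterwards shows the expected failure being tolerated. The core of that pattern, with an assumed bad-config path standing in for the real fixtures:

# Negative-test pattern: an intentionally invalid config must make bdevperf
# fail, so an unexpected success is what fails the test. bad.json is a
# stand-in for nonenclosed.json / nonarray.json.
if ./build/examples/bdevperf --json bad.json -q 128 -o 4096 -w write_zeroes -t 1; then
    echo "ERROR: bdevperf accepted an invalid JSON config" >&2
    exit 1
fi
echo "bdevperf rejected the invalid config as expected"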
00:09:32.240 [2024-07-12 13:35:20.750859] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:32.241 [2024-07-12 13:35:20.750872] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:32.499 00:09:32.499 real 0m0.399s 00:09:32.499 user 0m0.240s 00:09:32.499 sys 0m0.157s 00:09:32.499 13:35:20 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:09:32.499 13:35:20 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:32.499 13:35:20 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:09:32.499 ************************************ 00:09:32.499 END TEST bdev_json_nonarray 00:09:32.499 ************************************ 00:09:32.499 13:35:20 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:09:32.499 13:35:20 blockdev_general -- bdev/blockdev.sh@785 -- # true 00:09:32.499 13:35:20 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:09:32.499 13:35:20 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:09:32.499 13:35:20 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:32.499 13:35:20 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:32.499 13:35:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:32.499 ************************************ 00:09:32.499 START TEST bdev_qos 00:09:32.499 ************************************ 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=417672 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 417672' 00:09:32.499 Process qos testing pid: 417672 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 417672 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 417672 ']' 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:32.499 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:32.499 13:35:20 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:32.499 [2024-07-12 13:35:21.010803] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
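For the QoS suite the harness starts bdevperf with -z, so the app parks on the RPC socket, creates Malloc_0 and Null_1 over RPC, kicks off the 60 second randread run through examples/bdev/bdevperf/bdevperf.py perform_tests, and accepts each sub-test only if the measured rate lands within 90% to 110% of the cap it applied (for the IOPS case below: roughly 48752 IOPS measured, a 12000 IOPS cap, and an acceptance window of 10800 to 13200). The sketch compresses that flow; the limit derivation and the sleep/kill scaffolding are approximations of the harness's waitforlisten and killprocess helpers, not copies of blockdev.sh.

# Condensed sketch of the QoS flow driven below: launch bdevperf in RPC-wait
# mode, create the test bdevs, apply an IOPS cap, and check a +/-10% window.
# Limit math and the sleep/kill lines approximate the harness helpers.
SPDK_DIR=${SPDK_DIR:-$HOME/spdk}
rpc="$SPDK_DIR/scripts/rpc.py"
iostat="$SPDK_DIR/scripts/iostat.py"

"$SPDK_DIR/build/examples/bdevperf" -z -m 0x2 -q 256 -o 4096 -w randread -t 60 &
BDEVPERF_PID=$!
sleep 2                                    # harness: waitforlisten()

"$rpc" bdev_malloc_create -b Malloc_0 128 512
"$rpc" bdev_null_create Null_1 128 512
"$SPDK_DIR/examples/bdev/bdevperf/bdevperf.py" perform_tests &

io_result=$("$iostat" -d -i 1 -t 5 | awk '/Malloc_0/ {print int($2)}' | tail -1)
iops_limit=$(((io_result / 4) / 1000 * 1000))             # 48752 -> 12000 here
"$rpc" bdev_set_qos_limit --rw_ios_per_sec "$iops_limit" Malloc_0

measured=$("$iostat" -d -i 1 -t 5 | awk '/Malloc_0/ {print int($2)}' | tail -1)
lower=$((iops_limit * 90 / 100)); upper=$((iops_limit * 110 / 100))
[ "$measured" -ge "$lower" ] && [ "$measured" -le "$upper" ] \
    && echo "IOPS cap of $iops_limit honoured ($measured IOPS measured)"

kill "$BDEVPERF_PID"; wait "$BDEVPERF_PID" || true        # harness: killprocess()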
00:09:32.499 [2024-07-12 13:35:21.010873] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid417672 ] 00:09:32.758 [2024-07-12 13:35:21.148216] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:32.758 [2024-07-12 13:35:21.280494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:33.694 13:35:21 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:33.694 13:35:21 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:09:33.694 13:35:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:09:33.694 13:35:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:33.694 13:35:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:33.694 Malloc_0 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:33.694 [ 00:09:33.694 { 00:09:33.694 "name": "Malloc_0", 00:09:33.694 "aliases": [ 00:09:33.694 "415b2776-5cd2-4bb4-8904-257dee174798" 00:09:33.694 ], 00:09:33.694 "product_name": "Malloc disk", 00:09:33.694 "block_size": 512, 00:09:33.694 "num_blocks": 262144, 00:09:33.694 "uuid": "415b2776-5cd2-4bb4-8904-257dee174798", 00:09:33.694 "assigned_rate_limits": { 00:09:33.694 "rw_ios_per_sec": 0, 00:09:33.694 "rw_mbytes_per_sec": 0, 00:09:33.694 "r_mbytes_per_sec": 0, 00:09:33.694 "w_mbytes_per_sec": 0 00:09:33.694 }, 00:09:33.694 "claimed": false, 00:09:33.694 "zoned": false, 00:09:33.694 "supported_io_types": { 00:09:33.694 "read": true, 00:09:33.694 "write": true, 00:09:33.694 "unmap": true, 00:09:33.694 "flush": true, 00:09:33.694 "reset": true, 00:09:33.694 "nvme_admin": false, 00:09:33.694 "nvme_io": false, 00:09:33.694 "nvme_io_md": false, 00:09:33.694 "write_zeroes": true, 00:09:33.694 "zcopy": true, 00:09:33.694 "get_zone_info": false, 00:09:33.694 "zone_management": false, 00:09:33.694 "zone_append": false, 00:09:33.694 "compare": false, 
00:09:33.694 "compare_and_write": false, 00:09:33.694 "abort": true, 00:09:33.694 "seek_hole": false, 00:09:33.694 "seek_data": false, 00:09:33.694 "copy": true, 00:09:33.694 "nvme_iov_md": false 00:09:33.694 }, 00:09:33.694 "memory_domains": [ 00:09:33.694 { 00:09:33.694 "dma_device_id": "system", 00:09:33.694 "dma_device_type": 1 00:09:33.694 }, 00:09:33.694 { 00:09:33.694 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:33.694 "dma_device_type": 2 00:09:33.694 } 00:09:33.694 ], 00:09:33.694 "driver_specific": {} 00:09:33.694 } 00:09:33.694 ] 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:33.694 Null_1 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:33.694 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:33.694 [ 00:09:33.694 { 00:09:33.694 "name": "Null_1", 00:09:33.694 "aliases": [ 00:09:33.694 "e28f5e04-51ff-4b18-a06b-9295aedab763" 00:09:33.694 ], 00:09:33.694 "product_name": "Null disk", 00:09:33.694 "block_size": 512, 00:09:33.694 "num_blocks": 262144, 00:09:33.694 "uuid": "e28f5e04-51ff-4b18-a06b-9295aedab763", 00:09:33.694 "assigned_rate_limits": { 00:09:33.694 "rw_ios_per_sec": 0, 00:09:33.694 "rw_mbytes_per_sec": 0, 00:09:33.694 "r_mbytes_per_sec": 0, 00:09:33.694 "w_mbytes_per_sec": 0 00:09:33.694 }, 00:09:33.694 "claimed": false, 00:09:33.695 "zoned": false, 00:09:33.695 "supported_io_types": { 00:09:33.695 "read": true, 00:09:33.695 "write": true, 00:09:33.695 "unmap": false, 00:09:33.695 "flush": false, 00:09:33.695 "reset": true, 00:09:33.695 "nvme_admin": false, 00:09:33.695 "nvme_io": false, 00:09:33.695 "nvme_io_md": false, 00:09:33.695 "write_zeroes": true, 00:09:33.695 "zcopy": false, 00:09:33.695 "get_zone_info": false, 00:09:33.695 "zone_management": false, 00:09:33.695 "zone_append": false, 00:09:33.695 
"compare": false, 00:09:33.695 "compare_and_write": false, 00:09:33.695 "abort": true, 00:09:33.695 "seek_hole": false, 00:09:33.695 "seek_data": false, 00:09:33.695 "copy": false, 00:09:33.695 "nvme_iov_md": false 00:09:33.695 }, 00:09:33.695 "driver_specific": {} 00:09:33.695 } 00:09:33.695 ] 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:33.695 13:35:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:33.957 Running I/O for 60 seconds... 
00:09:39.231 13:35:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 48752.83 195011.31 0.00 0.00 196608.00 0.00 0.00 ' 00:09:39.231 13:35:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:39.231 13:35:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:39.231 13:35:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=48752.83 00:09:39.231 13:35:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 48752 00:09:39.231 13:35:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=48752 00:09:39.231 13:35:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=12000 00:09:39.231 13:35:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 12000 -gt 1000 ']' 00:09:39.231 13:35:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 12000 Malloc_0 00:09:39.231 13:35:27 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:39.231 13:35:27 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:39.232 13:35:27 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:39.232 13:35:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 12000 IOPS Malloc_0 00:09:39.232 13:35:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:39.232 13:35:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:39.232 13:35:27 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:39.232 ************************************ 00:09:39.232 START TEST bdev_qos_iops 00:09:39.232 ************************************ 00:09:39.232 13:35:27 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 12000 IOPS Malloc_0 00:09:39.232 13:35:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=12000 00:09:39.232 13:35:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:39.232 13:35:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:09:39.232 13:35:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:39.232 13:35:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:39.232 13:35:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:39.232 13:35:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:39.232 13:35:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:39.232 13:35:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 11999.80 47999.22 0.00 0.00 49440.00 0.00 0.00 ' 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=11999.80 00:09:44.502 13:35:32 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 11999 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=11999 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=10800 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=13200 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 11999 -lt 10800 ']' 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 11999 -gt 13200 ']' 00:09:44.502 00:09:44.502 real 0m5.319s 00:09:44.502 user 0m0.123s 00:09:44.502 sys 0m0.046s 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:44.502 13:35:32 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:09:44.502 ************************************ 00:09:44.502 END TEST bdev_qos_iops 00:09:44.502 ************************************ 00:09:44.502 13:35:32 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:09:44.502 13:35:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:09:44.502 13:35:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:44.502 13:35:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:44.502 13:35:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:44.502 13:35:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:44.502 13:35:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:44.502 13:35:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:49.771 13:35:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 14903.51 59614.05 0.00 0.00 61440.00 0.00 0.00 ' 00:09:49.771 13:35:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:49.771 13:35:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:49.771 13:35:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:49.771 13:35:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=61440.00 00:09:49.771 13:35:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 61440 00:09:49.771 13:35:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=61440 00:09:49.771 13:35:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=6 00:09:49.771 13:35:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 6 -lt 2 ']' 00:09:49.772 13:35:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 6 Null_1 00:09:49.772 13:35:38 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:49.772 13:35:38 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:49.772 13:35:38 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:49.772 13:35:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 6 BANDWIDTH Null_1 00:09:49.772 13:35:38 blockdev_general.bdev_qos -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:49.772 13:35:38 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:49.772 13:35:38 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:49.772 ************************************ 00:09:49.772 START TEST bdev_qos_bw 00:09:49.772 ************************************ 00:09:49.772 13:35:38 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 6 BANDWIDTH Null_1 00:09:49.772 13:35:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=6 00:09:49.772 13:35:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:49.772 13:35:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:09:49.772 13:35:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:49.772 13:35:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:49.772 13:35:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:49.772 13:35:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:49.772 13:35:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:49.772 13:35:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 1535.24 6140.95 0.00 0.00 6280.00 0.00 0.00 ' 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=6280.00 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 6280 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=6280 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=6144 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=5529 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=6758 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 6280 -lt 5529 ']' 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 6280 -gt 6758 ']' 00:09:55.041 00:09:55.041 real 0m5.281s 00:09:55.041 user 0m0.120s 00:09:55.041 sys 0m0.047s 00:09:55.041 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:55.042 13:35:43 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:09:55.042 ************************************ 00:09:55.042 END TEST bdev_qos_bw 00:09:55.042 ************************************ 00:09:55.042 13:35:43 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # 
return 0 00:09:55.042 13:35:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:09:55.042 13:35:43 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.042 13:35:43 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.042 13:35:43 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.042 13:35:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:09:55.042 13:35:43 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:55.042 13:35:43 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:55.042 13:35:43 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.042 ************************************ 00:09:55.042 START TEST bdev_qos_ro_bw 00:09:55.042 ************************************ 00:09:55.042 13:35:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:09:55.042 13:35:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:09:55.042 13:35:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:55.042 13:35:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:09:55.042 13:35:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:55.042 13:35:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:55.042 13:35:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:55.042 13:35:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:55.042 13:35:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:55.042 13:35:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.86 2047.44 0.00 0.00 2060.00 0.00 0.00 ' 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw 
-- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:10:00.314 00:10:00.314 real 0m5.195s 00:10:00.314 user 0m0.120s 00:10:00.314 sys 0m0.049s 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:00.314 13:35:48 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:10:00.314 ************************************ 00:10:00.314 END TEST bdev_qos_ro_bw 00:10:00.314 ************************************ 00:10:00.314 13:35:48 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:00.314 13:35:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:10:00.314 13:35:48 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:00.314 13:35:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:00.884 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:00.884 13:35:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:10:00.884 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:00.884 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:01.143 00:10:01.143 Latency(us) 00:10:01.144 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:01.144 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:01.144 Malloc_0 : 26.91 16332.63 63.80 0.00 0.00 15526.30 2578.70 503316.48 00:10:01.144 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:01.144 Null_1 : 27.11 15519.36 60.62 0.00 0.00 16432.24 1011.53 198773.54 00:10:01.144 =================================================================================================================== 00:10:01.144 Total : 31851.99 124.42 0.00 0.00 15969.38 1011.53 503316.48 00:10:01.144 0 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 417672 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 417672 ']' 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 417672 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 417672 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 417672' 00:10:01.144 killing process with pid 417672 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 417672 00:10:01.144 Received shutdown signal, test time was about 27.173354 seconds 00:10:01.144 00:10:01.144 Latency(us) 00:10:01.144 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:01.144 
=================================================================================================================== 00:10:01.144 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:01.144 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 417672 00:10:01.404 13:35:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:10:01.404 00:10:01.404 real 0m28.876s 00:10:01.404 user 0m29.681s 00:10:01.404 sys 0m0.974s 00:10:01.404 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:01.404 13:35:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:01.404 ************************************ 00:10:01.404 END TEST bdev_qos 00:10:01.404 ************************************ 00:10:01.404 13:35:49 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:01.404 13:35:49 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:10:01.404 13:35:49 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:01.404 13:35:49 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:01.404 13:35:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:01.404 ************************************ 00:10:01.404 START TEST bdev_qd_sampling 00:10:01.404 ************************************ 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=421490 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 421490' 00:10:01.404 Process bdev QD sampling period testing pid: 421490 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 421490 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 421490 ']' 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:01.404 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:01.404 13:35:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:01.404 [2024-07-12 13:35:49.972629] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
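Editor's note on the bdev_qos_ro_bw block that just completed: it reduces to three steps — cap Malloc_0 at 2 MiB/s of reads with bdev_set_qos_limit, sample the achieved rate with scripts/iostat.py, and accept the result if it lands within roughly +/-10% of the cap (the 1843/2252 KiB/s bounds printed by blockdev.sh). The following is a minimal stand-alone sketch of that check, not part of the harness; it assumes the bdevperf app is still serving RPCs on the default /var/tmp/spdk.sock socket (rpc_cmd in the log is a wrapper around scripts/rpc.py against that socket) and that column 6 of the iostat output is the read rate in KiB/s, as the suite itself assumes.

    # Hedged sketch: reproduce the read-bandwidth QoS check recorded above
    # against an already-running SPDK target (assumed on the default socket).
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK/scripts/rpc.py" bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0
    rate=$("$SPDK/scripts/iostat.py" -d -i 1 -t 5 | grep Malloc_0 | tail -1 | awk '{print $6}')
    rate=${rate%.*}                      # e.g. 2060.00 -> 2060
    limit=$((2 * 1024))                  # 2 MiB/s expressed in KiB/s = 2048
    lower=$((limit * 9 / 10))            # 1843
    upper=$((limit * 11 / 10))           # 2252
    if [ "$rate" -ge "$lower" ] && [ "$rate" -le "$upper" ]; then
        echo "read bandwidth ${rate} KiB/s is inside the ${lower}-${upper} KiB/s window"
    fi

The suite performs the same comparison with separate -lt/-gt tests, which is why the log above prints '[' 2060 -lt 1843 ']' and '[' 2060 -gt 2252 ']'.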
00:10:01.404 [2024-07-12 13:35:49.972696] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid421490 ] 00:10:01.664 [2024-07-12 13:35:50.104705] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:01.664 [2024-07-12 13:35:50.205495] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:01.664 [2024-07-12 13:35:50.205499] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:02.600 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:02.600 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:02.601 Malloc_QD 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:02.601 [ 00:10:02.601 { 00:10:02.601 "name": "Malloc_QD", 00:10:02.601 "aliases": [ 00:10:02.601 "e096025b-0666-4c20-b179-0b22169967d8" 00:10:02.601 ], 00:10:02.601 "product_name": "Malloc disk", 00:10:02.601 "block_size": 512, 00:10:02.601 "num_blocks": 262144, 00:10:02.601 "uuid": "e096025b-0666-4c20-b179-0b22169967d8", 00:10:02.601 "assigned_rate_limits": { 00:10:02.601 "rw_ios_per_sec": 0, 00:10:02.601 "rw_mbytes_per_sec": 0, 00:10:02.601 "r_mbytes_per_sec": 0, 00:10:02.601 "w_mbytes_per_sec": 0 00:10:02.601 }, 00:10:02.601 "claimed": false, 00:10:02.601 "zoned": false, 00:10:02.601 "supported_io_types": { 00:10:02.601 "read": true, 00:10:02.601 "write": true, 00:10:02.601 "unmap": true, 00:10:02.601 "flush": true, 00:10:02.601 "reset": true, 00:10:02.601 "nvme_admin": false, 00:10:02.601 
"nvme_io": false, 00:10:02.601 "nvme_io_md": false, 00:10:02.601 "write_zeroes": true, 00:10:02.601 "zcopy": true, 00:10:02.601 "get_zone_info": false, 00:10:02.601 "zone_management": false, 00:10:02.601 "zone_append": false, 00:10:02.601 "compare": false, 00:10:02.601 "compare_and_write": false, 00:10:02.601 "abort": true, 00:10:02.601 "seek_hole": false, 00:10:02.601 "seek_data": false, 00:10:02.601 "copy": true, 00:10:02.601 "nvme_iov_md": false 00:10:02.601 }, 00:10:02.601 "memory_domains": [ 00:10:02.601 { 00:10:02.601 "dma_device_id": "system", 00:10:02.601 "dma_device_type": 1 00:10:02.601 }, 00:10:02.601 { 00:10:02.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:02.601 "dma_device_type": 2 00:10:02.601 } 00:10:02.601 ], 00:10:02.601 "driver_specific": {} 00:10:02.601 } 00:10:02.601 ] 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:10:02.601 13:35:50 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:02.601 Running I/O for 5 seconds... 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.505 13:35:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:10:04.505 "tick_rate": 2300000000, 00:10:04.505 "ticks": 4765885644626300, 00:10:04.505 "bdevs": [ 00:10:04.506 { 00:10:04.506 "name": "Malloc_QD", 00:10:04.506 "bytes_read": 709931520, 00:10:04.506 "num_read_ops": 173316, 00:10:04.506 "bytes_written": 0, 00:10:04.506 "num_write_ops": 0, 00:10:04.506 "bytes_unmapped": 0, 00:10:04.506 "num_unmap_ops": 0, 00:10:04.506 "bytes_copied": 0, 00:10:04.506 "num_copy_ops": 0, 00:10:04.506 "read_latency_ticks": 2239865825680, 00:10:04.506 "max_read_latency_ticks": 17501252, 00:10:04.506 "min_read_latency_ticks": 290862, 00:10:04.506 "write_latency_ticks": 0, 00:10:04.506 "max_write_latency_ticks": 0, 00:10:04.506 "min_write_latency_ticks": 0, 00:10:04.506 "unmap_latency_ticks": 0, 00:10:04.506 "max_unmap_latency_ticks": 0, 00:10:04.506 
"min_unmap_latency_ticks": 0, 00:10:04.506 "copy_latency_ticks": 0, 00:10:04.506 "max_copy_latency_ticks": 0, 00:10:04.506 "min_copy_latency_ticks": 0, 00:10:04.506 "io_error": {}, 00:10:04.506 "queue_depth_polling_period": 10, 00:10:04.506 "queue_depth": 512, 00:10:04.506 "io_time": 30, 00:10:04.506 "weighted_io_time": 15360 00:10:04.506 } 00:10:04.506 ] 00:10:04.506 }' 00:10:04.506 13:35:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:04.506 00:10:04.506 Latency(us) 00:10:04.506 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:04.506 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:04.506 Malloc_QD : 1.98 50152.66 195.91 0.00 0.00 5091.37 1638.40 5470.83 00:10:04.506 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:04.506 Malloc_QD : 1.98 40827.99 159.48 0.00 0.00 6253.77 1474.56 7636.37 00:10:04.506 =================================================================================================================== 00:10:04.506 Total : 90980.65 355.39 0.00 0.00 5613.13 1474.56 7636.37 00:10:04.506 0 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 421490 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 421490 ']' 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 421490 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:04.506 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 421490 00:10:04.765 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:04.765 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:04.765 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 421490' 00:10:04.765 killing process with pid 421490 00:10:04.765 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 421490 00:10:04.765 Received shutdown signal, test time was about 2.062722 seconds 00:10:04.765 00:10:04.765 Latency(us) 00:10:04.765 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:04.765 =================================================================================================================== 00:10:04.765 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:04.765 13:35:53 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 421490 00:10:04.765 13:35:53 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:10:04.765 00:10:04.765 real 0m3.421s 00:10:04.765 user 0m6.653s 00:10:04.765 sys 0m0.425s 00:10:04.765 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:04.765 13:35:53 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:04.765 ************************************ 00:10:04.765 END TEST bdev_qd_sampling 00:10:04.765 ************************************ 00:10:05.024 13:35:53 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:05.024 13:35:53 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:10:05.024 13:35:53 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:05.024 13:35:53 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:05.024 13:35:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:05.024 ************************************ 00:10:05.024 START TEST bdev_error 00:10:05.024 ************************************ 00:10:05.024 13:35:53 blockdev_general.bdev_error -- common/autotest_common.sh@1123 -- # error_test_suite '' 00:10:05.024 13:35:53 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:10:05.024 13:35:53 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:10:05.024 13:35:53 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:10:05.024 13:35:53 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=422016 00:10:05.024 13:35:53 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 422016' 00:10:05.024 Process error testing pid: 422016 00:10:05.024 13:35:53 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:10:05.024 13:35:53 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 422016 00:10:05.024 13:35:53 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 422016 ']' 00:10:05.024 13:35:53 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:05.024 13:35:53 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:05.024 13:35:53 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:05.024 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:05.024 13:35:53 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:05.024 13:35:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:05.024 [2024-07-12 13:35:53.475086] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
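Editor's note on the bdev_qd_sampling suite that just finished: it is driven entirely over JSON-RPC — create a 128 MiB malloc bdev, set a queue-depth sampling period of 10 on it, let bdevperf issue random reads, then read queue_depth_polling_period back out of bdev_get_iostat to confirm the setting stuck. Below is a minimal sketch of that RPC sequence; it assumes the default /var/tmp/spdk.sock socket and jq on the PATH, exactly as the suite uses them.

    # Hedged sketch of the RPC flow recorded above; bdev names match the suite.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    rpc="$SPDK/scripts/rpc.py"
    "$rpc" bdev_malloc_create -b Malloc_QD 128 512        # 128 MiB of 512-byte blocks
    "$rpc" bdev_set_qd_sampling_period Malloc_QD 10
    period=$("$rpc" bdev_get_iostat -b Malloc_QD | jq -r '.bdevs[0].queue_depth_polling_period')
    [ "$period" -eq 10 ] || echo "unexpected sampling period: $period"
    "$rpc" bdev_malloc_delete Malloc_QD                   # teardown, as in the suite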
00:10:05.024 [2024-07-12 13:35:53.475142] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid422016 ] 00:10:05.024 [2024-07-12 13:35:53.594319] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:05.283 [2024-07-12 13:35:53.728099] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:05.851 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:05.851 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:05.851 13:35:54 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:05.851 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:05.851 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.112 Dev_1 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.112 [ 00:10:06.112 { 00:10:06.112 "name": "Dev_1", 00:10:06.112 "aliases": [ 00:10:06.112 "75be2ef8-db08-455a-82a2-3a43dffbaaeb" 00:10:06.112 ], 00:10:06.112 "product_name": "Malloc disk", 00:10:06.112 "block_size": 512, 00:10:06.112 "num_blocks": 262144, 00:10:06.112 "uuid": "75be2ef8-db08-455a-82a2-3a43dffbaaeb", 00:10:06.112 "assigned_rate_limits": { 00:10:06.112 "rw_ios_per_sec": 0, 00:10:06.112 "rw_mbytes_per_sec": 0, 00:10:06.112 "r_mbytes_per_sec": 0, 00:10:06.112 "w_mbytes_per_sec": 0 00:10:06.112 }, 00:10:06.112 "claimed": false, 00:10:06.112 "zoned": false, 00:10:06.112 "supported_io_types": { 00:10:06.112 "read": true, 00:10:06.112 "write": true, 00:10:06.112 "unmap": true, 00:10:06.112 "flush": true, 00:10:06.112 "reset": true, 00:10:06.112 "nvme_admin": false, 00:10:06.112 "nvme_io": false, 00:10:06.112 "nvme_io_md": false, 00:10:06.112 "write_zeroes": true, 00:10:06.112 "zcopy": true, 00:10:06.112 "get_zone_info": false, 00:10:06.112 "zone_management": false, 00:10:06.112 "zone_append": false, 00:10:06.112 
"compare": false, 00:10:06.112 "compare_and_write": false, 00:10:06.112 "abort": true, 00:10:06.112 "seek_hole": false, 00:10:06.112 "seek_data": false, 00:10:06.112 "copy": true, 00:10:06.112 "nvme_iov_md": false 00:10:06.112 }, 00:10:06.112 "memory_domains": [ 00:10:06.112 { 00:10:06.112 "dma_device_id": "system", 00:10:06.112 "dma_device_type": 1 00:10:06.112 }, 00:10:06.112 { 00:10:06.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:06.112 "dma_device_type": 2 00:10:06.112 } 00:10:06.112 ], 00:10:06.112 "driver_specific": {} 00:10:06.112 } 00:10:06.112 ] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:06.112 13:35:54 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.112 true 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.112 Dev_2 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.112 [ 00:10:06.112 { 00:10:06.112 "name": "Dev_2", 00:10:06.112 "aliases": [ 00:10:06.112 "baa20591-8d0a-426c-b982-ac2b24f3ec8b" 00:10:06.112 ], 00:10:06.112 "product_name": "Malloc disk", 00:10:06.112 "block_size": 512, 00:10:06.112 "num_blocks": 262144, 00:10:06.112 "uuid": "baa20591-8d0a-426c-b982-ac2b24f3ec8b", 00:10:06.112 "assigned_rate_limits": { 00:10:06.112 "rw_ios_per_sec": 0, 00:10:06.112 "rw_mbytes_per_sec": 0, 00:10:06.112 "r_mbytes_per_sec": 0, 00:10:06.112 "w_mbytes_per_sec": 0 00:10:06.112 }, 00:10:06.112 "claimed": false, 
00:10:06.112 "zoned": false, 00:10:06.112 "supported_io_types": { 00:10:06.112 "read": true, 00:10:06.112 "write": true, 00:10:06.112 "unmap": true, 00:10:06.112 "flush": true, 00:10:06.112 "reset": true, 00:10:06.112 "nvme_admin": false, 00:10:06.112 "nvme_io": false, 00:10:06.112 "nvme_io_md": false, 00:10:06.112 "write_zeroes": true, 00:10:06.112 "zcopy": true, 00:10:06.112 "get_zone_info": false, 00:10:06.112 "zone_management": false, 00:10:06.112 "zone_append": false, 00:10:06.112 "compare": false, 00:10:06.112 "compare_and_write": false, 00:10:06.112 "abort": true, 00:10:06.112 "seek_hole": false, 00:10:06.112 "seek_data": false, 00:10:06.112 "copy": true, 00:10:06.112 "nvme_iov_md": false 00:10:06.112 }, 00:10:06.112 "memory_domains": [ 00:10:06.112 { 00:10:06.112 "dma_device_id": "system", 00:10:06.112 "dma_device_type": 1 00:10:06.112 }, 00:10:06.112 { 00:10:06.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:06.112 "dma_device_type": 2 00:10:06.112 } 00:10:06.112 ], 00:10:06.112 "driver_specific": {} 00:10:06.112 } 00:10:06.112 ] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:06.112 13:35:54 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:06.112 13:35:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.112 13:35:54 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:10:06.112 13:35:54 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:06.373 Running I/O for 5 seconds... 00:10:07.311 13:35:55 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 422016 00:10:07.311 13:35:55 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 422016' 00:10:07.311 Process is existed as continue on error is set. 
Pid: 422016 00:10:07.311 13:35:55 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:10:07.311 13:35:55 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:07.311 13:35:55 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:07.311 13:35:55 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:07.311 13:35:55 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:10:07.311 13:35:55 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:07.311 13:35:55 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:07.311 13:35:55 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:07.311 13:35:55 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:10:07.311 Timeout while waiting for response: 00:10:07.311 00:10:07.311 00:10:11.505 00:10:11.505 Latency(us) 00:10:11.505 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:11.505 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:11.505 EE_Dev_1 : 0.89 29209.77 114.10 5.59 0.00 542.95 166.51 861.94 00:10:11.505 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:11.505 Dev_2 : 5.00 63341.51 247.43 0.00 0.00 248.10 83.70 28038.01 00:10:11.505 =================================================================================================================== 00:10:11.505 Total : 92551.29 361.53 5.59 0.00 270.57 83.70 28038.01 00:10:12.442 13:36:00 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 422016 00:10:12.442 13:36:00 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 422016 ']' 00:10:12.442 13:36:00 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 422016 00:10:12.442 13:36:00 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:10:12.442 13:36:00 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:12.442 13:36:00 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 422016 00:10:12.442 13:36:00 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:12.442 13:36:00 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:12.442 13:36:00 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 422016' 00:10:12.442 killing process with pid 422016 00:10:12.442 13:36:00 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 422016 00:10:12.442 Received shutdown signal, test time was about 5.000000 seconds 00:10:12.442 00:10:12.442 Latency(us) 00:10:12.442 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:12.442 =================================================================================================================== 00:10:12.442 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:12.442 13:36:00 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 422016 00:10:12.702 13:36:01 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=423134 00:10:12.702 13:36:01 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 423134' 00:10:12.702 Process error testing pid: 423134 00:10:12.702 13:36:01 
blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:10:12.702 13:36:01 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 423134 00:10:12.702 13:36:01 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 423134 ']' 00:10:12.702 13:36:01 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:12.702 13:36:01 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:12.702 13:36:01 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:12.702 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:12.702 13:36:01 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:12.702 13:36:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:12.702 [2024-07-12 13:36:01.160277] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:10:12.702 [2024-07-12 13:36:01.160351] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid423134 ] 00:10:12.961 [2024-07-12 13:36:01.294552] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:12.961 [2024-07-12 13:36:01.409666] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:13.529 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:13.529 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:13.529 13:36:02 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:13.529 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.529 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.789 Dev_1 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:13.789 13:36:02 
blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.789 [ 00:10:13.789 { 00:10:13.789 "name": "Dev_1", 00:10:13.789 "aliases": [ 00:10:13.789 "ab1f9a08-a786-4c97-9cdb-c9cef386081e" 00:10:13.789 ], 00:10:13.789 "product_name": "Malloc disk", 00:10:13.789 "block_size": 512, 00:10:13.789 "num_blocks": 262144, 00:10:13.789 "uuid": "ab1f9a08-a786-4c97-9cdb-c9cef386081e", 00:10:13.789 "assigned_rate_limits": { 00:10:13.789 "rw_ios_per_sec": 0, 00:10:13.789 "rw_mbytes_per_sec": 0, 00:10:13.789 "r_mbytes_per_sec": 0, 00:10:13.789 "w_mbytes_per_sec": 0 00:10:13.789 }, 00:10:13.789 "claimed": false, 00:10:13.789 "zoned": false, 00:10:13.789 "supported_io_types": { 00:10:13.789 "read": true, 00:10:13.789 "write": true, 00:10:13.789 "unmap": true, 00:10:13.789 "flush": true, 00:10:13.789 "reset": true, 00:10:13.789 "nvme_admin": false, 00:10:13.789 "nvme_io": false, 00:10:13.789 "nvme_io_md": false, 00:10:13.789 "write_zeroes": true, 00:10:13.789 "zcopy": true, 00:10:13.789 "get_zone_info": false, 00:10:13.789 "zone_management": false, 00:10:13.789 "zone_append": false, 00:10:13.789 "compare": false, 00:10:13.789 "compare_and_write": false, 00:10:13.789 "abort": true, 00:10:13.789 "seek_hole": false, 00:10:13.789 "seek_data": false, 00:10:13.789 "copy": true, 00:10:13.789 "nvme_iov_md": false 00:10:13.789 }, 00:10:13.789 "memory_domains": [ 00:10:13.789 { 00:10:13.789 "dma_device_id": "system", 00:10:13.789 "dma_device_type": 1 00:10:13.789 }, 00:10:13.789 { 00:10:13.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:13.789 "dma_device_type": 2 00:10:13.789 } 00:10:13.789 ], 00:10:13.789 "driver_specific": {} 00:10:13.789 } 00:10:13.789 ] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:13.789 13:36:02 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.789 true 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.789 Dev_2 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@902 
-- # rpc_cmd bdev_wait_for_examine 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.789 [ 00:10:13.789 { 00:10:13.789 "name": "Dev_2", 00:10:13.789 "aliases": [ 00:10:13.789 "f6a6b6a4-4a6f-4978-b971-d6da96c6fccc" 00:10:13.789 ], 00:10:13.789 "product_name": "Malloc disk", 00:10:13.789 "block_size": 512, 00:10:13.789 "num_blocks": 262144, 00:10:13.789 "uuid": "f6a6b6a4-4a6f-4978-b971-d6da96c6fccc", 00:10:13.789 "assigned_rate_limits": { 00:10:13.789 "rw_ios_per_sec": 0, 00:10:13.789 "rw_mbytes_per_sec": 0, 00:10:13.789 "r_mbytes_per_sec": 0, 00:10:13.789 "w_mbytes_per_sec": 0 00:10:13.789 }, 00:10:13.789 "claimed": false, 00:10:13.789 "zoned": false, 00:10:13.789 "supported_io_types": { 00:10:13.789 "read": true, 00:10:13.789 "write": true, 00:10:13.789 "unmap": true, 00:10:13.789 "flush": true, 00:10:13.789 "reset": true, 00:10:13.789 "nvme_admin": false, 00:10:13.789 "nvme_io": false, 00:10:13.789 "nvme_io_md": false, 00:10:13.789 "write_zeroes": true, 00:10:13.789 "zcopy": true, 00:10:13.789 "get_zone_info": false, 00:10:13.789 "zone_management": false, 00:10:13.789 "zone_append": false, 00:10:13.789 "compare": false, 00:10:13.789 "compare_and_write": false, 00:10:13.789 "abort": true, 00:10:13.789 "seek_hole": false, 00:10:13.789 "seek_data": false, 00:10:13.789 "copy": true, 00:10:13.789 "nvme_iov_md": false 00:10:13.789 }, 00:10:13.789 "memory_domains": [ 00:10:13.789 { 00:10:13.789 "dma_device_id": "system", 00:10:13.789 "dma_device_type": 1 00:10:13.789 }, 00:10:13.789 { 00:10:13.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:13.789 "dma_device_type": 2 00:10:13.789 } 00:10:13.789 ], 00:10:13.789 "driver_specific": {} 00:10:13.789 } 00:10:13.789 ] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:13.789 13:36:02 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.789 13:36:02 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 423134 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:10:13.789 13:36:02 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 423134 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:10:13.789 13:36:02 blockdev_general.bdev_error -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:13.789 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 423134 00:10:14.049 Running I/O for 5 seconds... 00:10:14.049 task offset: 218408 on job bdev=EE_Dev_1 fails 00:10:14.049 00:10:14.049 Latency(us) 00:10:14.049 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:14.049 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:14.049 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:14.049 EE_Dev_1 : 0.00 23554.60 92.01 5353.32 0.00 461.39 163.84 819.20 00:10:14.049 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:14.049 Dev_2 : 0.00 14388.49 56.21 0.00 0.00 826.38 162.06 1538.67 00:10:14.049 =================================================================================================================== 00:10:14.049 Total : 37943.09 148.22 5353.32 0.00 659.35 162.06 1538.67 00:10:14.049 [2024-07-12 13:36:02.389098] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:14.049 request: 00:10:14.049 { 00:10:14.049 "method": "perform_tests", 00:10:14.049 "req_id": 1 00:10:14.049 } 00:10:14.049 Got JSON-RPC error response 00:10:14.049 response: 00:10:14.049 { 00:10:14.049 "code": -32603, 00:10:14.049 "message": "bdevperf failed with error Operation not permitted" 00:10:14.049 } 00:10:14.310 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:10:14.310 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:14.310 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:10:14.310 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:10:14.310 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:10:14.310 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:14.310 00:10:14.310 real 0m9.332s 00:10:14.310 user 0m9.652s 00:10:14.310 sys 0m1.002s 00:10:14.310 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:14.310 13:36:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:14.310 ************************************ 00:10:14.310 END TEST bdev_error 00:10:14.310 ************************************ 00:10:14.310 13:36:02 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:14.310 13:36:02 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:14.310 13:36:02 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:14.310 13:36:02 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:14.310 13:36:02 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:14.310 ************************************ 00:10:14.310 START TEST bdev_stat 00:10:14.310 ************************************ 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=423397 00:10:14.310 13:36:02 
blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 423397' 00:10:14.310 Process Bdev IO statistics testing pid: 423397 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 423397 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 423397 ']' 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:14.310 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:14.310 13:36:02 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:14.568 [2024-07-12 13:36:02.895546] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:10:14.568 [2024-07-12 13:36:02.895616] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid423397 ] 00:10:14.568 [2024-07-12 13:36:03.023657] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:14.568 [2024-07-12 13:36:03.131324] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:14.568 [2024-07-12 13:36:03.131329] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:15.502 Malloc_STAT 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:15.502 
13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:15.502 [ 00:10:15.502 { 00:10:15.502 "name": "Malloc_STAT", 00:10:15.502 "aliases": [ 00:10:15.502 "5947de97-81e8-41d3-998f-4582e1a6eb83" 00:10:15.502 ], 00:10:15.502 "product_name": "Malloc disk", 00:10:15.502 "block_size": 512, 00:10:15.502 "num_blocks": 262144, 00:10:15.502 "uuid": "5947de97-81e8-41d3-998f-4582e1a6eb83", 00:10:15.502 "assigned_rate_limits": { 00:10:15.502 "rw_ios_per_sec": 0, 00:10:15.502 "rw_mbytes_per_sec": 0, 00:10:15.502 "r_mbytes_per_sec": 0, 00:10:15.502 "w_mbytes_per_sec": 0 00:10:15.502 }, 00:10:15.502 "claimed": false, 00:10:15.502 "zoned": false, 00:10:15.502 "supported_io_types": { 00:10:15.502 "read": true, 00:10:15.502 "write": true, 00:10:15.502 "unmap": true, 00:10:15.502 "flush": true, 00:10:15.502 "reset": true, 00:10:15.502 "nvme_admin": false, 00:10:15.502 "nvme_io": false, 00:10:15.502 "nvme_io_md": false, 00:10:15.502 "write_zeroes": true, 00:10:15.502 "zcopy": true, 00:10:15.502 "get_zone_info": false, 00:10:15.502 "zone_management": false, 00:10:15.502 "zone_append": false, 00:10:15.502 "compare": false, 00:10:15.502 "compare_and_write": false, 00:10:15.502 "abort": true, 00:10:15.502 "seek_hole": false, 00:10:15.502 "seek_data": false, 00:10:15.502 "copy": true, 00:10:15.502 "nvme_iov_md": false 00:10:15.502 }, 00:10:15.502 "memory_domains": [ 00:10:15.502 { 00:10:15.502 "dma_device_id": "system", 00:10:15.502 "dma_device_type": 1 00:10:15.502 }, 00:10:15.502 { 00:10:15.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:15.502 "dma_device_type": 2 00:10:15.502 } 00:10:15.502 ], 00:10:15.502 "driver_specific": {} 00:10:15.502 } 00:10:15.502 ] 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:15.502 13:36:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:15.502 Running I/O for 10 seconds... 
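Editor's note on the bdev_stat run below: it performs a consistency check on the per-channel counters by taking a whole-bdev iostat snapshot, then a per-channel snapshot (bdev_get_iostat -b Malloc_STAT -c), then a second whole-bdev snapshot, and requiring the summed per-channel num_read_ops to fall between the two totals (179968 between 173572 and 196356 in this run). The following is a minimal sketch of that check, assuming the default RPC socket; it condenses the suite's two per-channel jq lookups into a single jq sum.

    # Hedged sketch of the counter-consistency check shown below.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    rpc="$SPDK/scripts/rpc.py"
    io1=$("$rpc" bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    per_ch=$("$rpc" bdev_get_iostat -b Malloc_STAT -c | jq -r '[.channels[].num_read_ops] | add')
    io2=$("$rpc" bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
    if [ "$per_ch" -ge "$io1" ] && [ "$per_ch" -le "$io2" ]; then
        echo "per-channel reads ($per_ch) sit between the two whole-bdev totals ($io1, $io2)"
    fi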
00:10:17.406 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:17.406 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:17.406 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:17.406 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:17.406 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:17.407 "tick_rate": 2300000000, 00:10:17.407 "ticks": 4765915408652702, 00:10:17.407 "bdevs": [ 00:10:17.407 { 00:10:17.407 "name": "Malloc_STAT", 00:10:17.407 "bytes_read": 710980096, 00:10:17.407 "num_read_ops": 173572, 00:10:17.407 "bytes_written": 0, 00:10:17.407 "num_write_ops": 0, 00:10:17.407 "bytes_unmapped": 0, 00:10:17.407 "num_unmap_ops": 0, 00:10:17.407 "bytes_copied": 0, 00:10:17.407 "num_copy_ops": 0, 00:10:17.407 "read_latency_ticks": 2232324291720, 00:10:17.407 "max_read_latency_ticks": 17287622, 00:10:17.407 "min_read_latency_ticks": 243554, 00:10:17.407 "write_latency_ticks": 0, 00:10:17.407 "max_write_latency_ticks": 0, 00:10:17.407 "min_write_latency_ticks": 0, 00:10:17.407 "unmap_latency_ticks": 0, 00:10:17.407 "max_unmap_latency_ticks": 0, 00:10:17.407 "min_unmap_latency_ticks": 0, 00:10:17.407 "copy_latency_ticks": 0, 00:10:17.407 "max_copy_latency_ticks": 0, 00:10:17.407 "min_copy_latency_ticks": 0, 00:10:17.407 "io_error": {} 00:10:17.407 } 00:10:17.407 ] 00:10:17.407 }' 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=173572 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.407 13:36:05 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:17.666 "tick_rate": 2300000000, 00:10:17.666 "ticks": 4765915573046586, 00:10:17.666 "name": "Malloc_STAT", 00:10:17.666 "channels": [ 00:10:17.666 { 00:10:17.666 "thread_id": 2, 00:10:17.666 "bytes_read": 406847488, 00:10:17.666 "num_read_ops": 99328, 00:10:17.666 "bytes_written": 0, 00:10:17.666 "num_write_ops": 0, 00:10:17.666 "bytes_unmapped": 0, 00:10:17.666 "num_unmap_ops": 0, 
00:10:17.666 "bytes_copied": 0, 00:10:17.666 "num_copy_ops": 0, 00:10:17.666 "read_latency_ticks": 1156967503814, 00:10:17.666 "max_read_latency_ticks": 12659306, 00:10:17.666 "min_read_latency_ticks": 8124992, 00:10:17.666 "write_latency_ticks": 0, 00:10:17.666 "max_write_latency_ticks": 0, 00:10:17.666 "min_write_latency_ticks": 0, 00:10:17.666 "unmap_latency_ticks": 0, 00:10:17.666 "max_unmap_latency_ticks": 0, 00:10:17.666 "min_unmap_latency_ticks": 0, 00:10:17.666 "copy_latency_ticks": 0, 00:10:17.666 "max_copy_latency_ticks": 0, 00:10:17.666 "min_copy_latency_ticks": 0 00:10:17.666 }, 00:10:17.666 { 00:10:17.666 "thread_id": 3, 00:10:17.666 "bytes_read": 330301440, 00:10:17.666 "num_read_ops": 80640, 00:10:17.666 "bytes_written": 0, 00:10:17.666 "num_write_ops": 0, 00:10:17.666 "bytes_unmapped": 0, 00:10:17.666 "num_unmap_ops": 0, 00:10:17.666 "bytes_copied": 0, 00:10:17.666 "num_copy_ops": 0, 00:10:17.666 "read_latency_ticks": 1157781637504, 00:10:17.666 "max_read_latency_ticks": 17287622, 00:10:17.666 "min_read_latency_ticks": 9290704, 00:10:17.666 "write_latency_ticks": 0, 00:10:17.666 "max_write_latency_ticks": 0, 00:10:17.666 "min_write_latency_ticks": 0, 00:10:17.666 "unmap_latency_ticks": 0, 00:10:17.666 "max_unmap_latency_ticks": 0, 00:10:17.666 "min_unmap_latency_ticks": 0, 00:10:17.666 "copy_latency_ticks": 0, 00:10:17.666 "max_copy_latency_ticks": 0, 00:10:17.666 "min_copy_latency_ticks": 0 00:10:17.666 } 00:10:17.666 ] 00:10:17.666 }' 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=99328 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=99328 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=80640 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=179968 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:10:17.666 "tick_rate": 2300000000, 00:10:17.666 "ticks": 4765915983142744, 00:10:17.666 "bdevs": [ 00:10:17.666 { 00:10:17.666 "name": "Malloc_STAT", 00:10:17.666 "bytes_read": 804303360, 00:10:17.666 "num_read_ops": 196356, 00:10:17.666 "bytes_written": 0, 00:10:17.666 "num_write_ops": 0, 00:10:17.666 "bytes_unmapped": 0, 00:10:17.666 "num_unmap_ops": 0, 00:10:17.666 "bytes_copied": 0, 00:10:17.666 "num_copy_ops": 0, 00:10:17.666 "read_latency_ticks": 2525798240450, 00:10:17.666 "max_read_latency_ticks": 17287622, 00:10:17.666 "min_read_latency_ticks": 243554, 00:10:17.666 "write_latency_ticks": 0, 00:10:17.666 "max_write_latency_ticks": 0, 00:10:17.666 "min_write_latency_ticks": 0, 00:10:17.666 "unmap_latency_ticks": 0, 00:10:17.666 "max_unmap_latency_ticks": 0, 00:10:17.666 "min_unmap_latency_ticks": 0, 00:10:17.666 "copy_latency_ticks": 0, 00:10:17.666 "max_copy_latency_ticks": 0, 00:10:17.666 
"min_copy_latency_ticks": 0, 00:10:17.666 "io_error": {} 00:10:17.666 } 00:10:17.666 ] 00:10:17.666 }' 00:10:17.666 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=196356 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 179968 -lt 173572 ']' 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 179968 -gt 196356 ']' 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:17.926 00:10:17.926 Latency(us) 00:10:17.926 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:17.926 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:17.926 Malloc_STAT : 2.25 50440.01 197.03 0.00 0.00 5063.61 1374.83 5527.82 00:10:17.926 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:17.926 Malloc_STAT : 2.26 40979.62 160.08 0.00 0.00 6231.79 1196.74 7522.39 00:10:17.926 =================================================================================================================== 00:10:17.926 Total : 91419.63 357.11 0.00 0.00 5587.48 1196.74 7522.39 00:10:17.926 0 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 423397 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 423397 ']' 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 423397 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 423397 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 423397' 00:10:17.926 killing process with pid 423397 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 423397 00:10:17.926 Received shutdown signal, test time was about 2.332459 seconds 00:10:17.926 00:10:17.926 Latency(us) 00:10:17.926 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:17.926 =================================================================================================================== 00:10:17.926 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:17.926 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 423397 00:10:18.186 13:36:06 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:10:18.186 00:10:18.186 real 0m3.725s 00:10:18.186 user 0m7.512s 00:10:18.186 sys 0m0.477s 00:10:18.186 13:36:06 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:18.186 13:36:06 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:18.186 ************************************ 00:10:18.186 END TEST bdev_stat 00:10:18.186 ************************************ 00:10:18.186 13:36:06 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:18.186 13:36:06 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:10:18.186 13:36:06 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:10:18.186 13:36:06 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:10:18.186 13:36:06 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:10:18.186 13:36:06 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:18.186 13:36:06 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:18.186 13:36:06 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:18.186 13:36:06 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:18.186 13:36:06 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:18.186 13:36:06 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:18.186 00:10:18.186 real 1m59.473s 00:10:18.186 user 7m15.666s 00:10:18.186 sys 0m23.669s 00:10:18.186 13:36:06 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:18.186 13:36:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:18.186 ************************************ 00:10:18.186 END TEST blockdev_general 00:10:18.186 ************************************ 00:10:18.186 13:36:06 -- common/autotest_common.sh@1142 -- # return 0 00:10:18.186 13:36:06 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:18.186 13:36:06 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:18.186 13:36:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:18.186 13:36:06 -- common/autotest_common.sh@10 -- # set +x 00:10:18.186 ************************************ 00:10:18.186 START TEST bdev_raid 00:10:18.186 ************************************ 00:10:18.186 13:36:06 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:18.445 * Looking for test storage... 
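The bdev_stat sequence above reduces to a single invariant: the per-channel read counts, summed, must fall between the whole-bdev read count snapshotted before the per-channel query and the one snapshotted after it, because the test's read workload keeps running while the three RPCs execute. A minimal sketch of that check, assuming the same rpc.py path as the rest of this run (default RPC socket) and using a jq sum as shorthand for the script's accumulation loop:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  io_count1=$($rpc bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
  per_channel_all=$($rpc bdev_get_iostat -b Malloc_STAT -c | jq -r '[.channels[].num_read_ops] | add')
  io_count2=$($rpc bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
  # Reads keep completing between the snapshots, so the summed per-channel
  # count must land between the first and second whole-bdev read counts.
  [ "$per_channel_all" -ge "$io_count1" ] && [ "$per_channel_all" -le "$io_count2" ]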
00:10:18.445 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:18.445 13:36:06 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:18.445 13:36:06 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:18.445 13:36:06 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:18.445 13:36:06 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:10:18.445 13:36:06 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:10:18.445 13:36:06 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:10:18.445 13:36:06 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:10:18.445 13:36:06 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:10:18.445 13:36:06 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:10:18.445 13:36:06 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:10:18.445 13:36:06 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:10:18.445 13:36:06 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:10:18.445 13:36:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:18.445 13:36:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:18.445 13:36:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:18.445 ************************************ 00:10:18.445 START TEST raid_function_test_raid0 00:10:18.445 ************************************ 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=424393 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 424393' 00:10:18.445 Process raid pid: 424393 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 424393 /var/tmp/spdk-raid.sock 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 424393 ']' 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:18.445 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
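waitforlisten now blocks until the bdev_svc instance launched above (pid 424393) is accepting RPCs on /var/tmp/spdk-raid.sock. A simplified stand-in for that wait, not the helper's actual implementation, using rpc_get_methods as the readiness probe and an arbitrary ~10 s budget:

  pid=424393
  sock=/var/tmp/spdk-raid.sock
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  for _ in $(seq 1 100); do
      kill -0 "$pid" 2>/dev/null || { echo "bdev_svc exited before listening" >&2; exit 1; }
      # Ready once the socket exists and a trivial RPC round-trips.
      [ -S "$sock" ] && "$rpc" -s "$sock" rpc_get_methods >/dev/null 2>&1 && break
      sleep 0.1
  done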
00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:18.445 13:36:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:18.445 [2024-07-12 13:36:06.928134] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:10:18.445 [2024-07-12 13:36:06.928201] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:18.704 [2024-07-12 13:36:07.057732] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:18.704 [2024-07-12 13:36:07.154499] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:18.704 [2024-07-12 13:36:07.217441] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:18.704 [2024-07-12 13:36:07.217477] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:19.271 13:36:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:19.271 13:36:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:10:19.271 13:36:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:10:19.272 13:36:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:10:19.272 13:36:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:19.272 13:36:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:10:19.272 13:36:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:19.531 [2024-07-12 13:36:08.066202] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:19.531 [2024-07-12 13:36:08.067353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:19.531 [2024-07-12 13:36:08.067410] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27614a0 00:10:19.531 [2024-07-12 13:36:08.067421] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:19.531 [2024-07-12 13:36:08.067676] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2764df0 00:10:19.531 [2024-07-12 13:36:08.067791] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27614a0 00:10:19.531 [2024-07-12 13:36:08.067800] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x27614a0 00:10:19.531 [2024-07-12 13:36:08.067903] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:19.531 Base_1 00:10:19.531 Base_2 00:10:19.531 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:19.531 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:19.531 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:19.791 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:19.791 13:36:08 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:19.791 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:19.791 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:19.791 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:19.791 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:19.791 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:19.791 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:19.791 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:10:19.791 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:19.791 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:19.791 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:20.358 [2024-07-12 13:36:08.832269] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2764c00 00:10:20.358 /dev/nbd0 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:20.358 1+0 records in 00:10:20.358 1+0 records out 00:10:20.358 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253268 s, 16.2 MB/s 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:10:20.358 13:36:08 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:20.358 13:36:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:20.616 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:20.616 { 00:10:20.616 "nbd_device": "/dev/nbd0", 00:10:20.616 "bdev_name": "raid" 00:10:20.616 } 00:10:20.616 ]' 00:10:20.616 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:20.616 { 00:10:20.616 "nbd_device": "/dev/nbd0", 00:10:20.616 "bdev_name": "raid" 00:10:20.616 } 00:10:20.616 ]' 00:10:20.616 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:20.616 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:20.616 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:20.616 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:20.874 
13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:20.874 4096+0 records in 00:10:20.874 4096+0 records out 00:10:20.874 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0293163 s, 71.5 MB/s 00:10:20.874 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:21.132 4096+0 records in 00:10:21.132 4096+0 records out 00:10:21.132 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.318147 s, 6.6 MB/s 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:21.132 128+0 records in 00:10:21.132 128+0 records out 00:10:21.132 65536 bytes (66 kB, 64 KiB) copied, 0.000375294 s, 175 MB/s 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:21.132 2035+0 records in 00:10:21.132 2035+0 records out 00:10:21.132 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0119272 s, 87.4 MB/s 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:21.132 456+0 records in 00:10:21.132 456+0 records out 00:10:21.132 233472 bytes (233 kB, 228 KiB) copied, 0.00164894 s, 142 MB/s 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:21.132 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:21.390 [2024-07-12 13:36:09.882759] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:21.390 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:21.390 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:21.390 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:21.390 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:21.390 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:21.390 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:21.390 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:10:21.390 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:10:21.390 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:21.390 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:21.390 13:36:09 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 424393 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 424393 ']' 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 424393 00:10:21.649 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:10:21.908 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:21.908 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 424393 00:10:21.908 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:21.908 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:21.908 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 424393' 00:10:21.908 killing process with pid 424393 00:10:21.908 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 424393 00:10:21.908 [2024-07-12 13:36:10.273728] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:21.908 [2024-07-12 13:36:10.273800] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:21.908 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 424393 00:10:21.908 [2024-07-12 13:36:10.273844] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:21.908 [2024-07-12 13:36:10.273858] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27614a0 name raid, state offline 00:10:21.908 [2024-07-12 13:36:10.290343] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:22.167 13:36:10 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:10:22.167 00:10:22.167 real 0m3.625s 00:10:22.167 user 0m4.842s 00:10:22.167 sys 0m1.312s 00:10:22.167 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:22.167 13:36:10 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:22.167 ************************************ 00:10:22.167 END TEST raid_function_test_raid0 00:10:22.167 ************************************ 00:10:22.167 13:36:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:22.167 13:36:10 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:10:22.167 13:36:10 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:22.167 13:36:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:22.167 13:36:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:22.167 ************************************ 00:10:22.167 START TEST raid_function_test_concat 00:10:22.167 ************************************ 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=424996 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 424996' 00:10:22.167 Process raid pid: 424996 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 424996 /var/tmp/spdk-raid.sock 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 424996 ']' 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:22.167 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:22.167 13:36:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:22.167 [2024-07-12 13:36:10.645764] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
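The raid_unmap_data_verify pass that just finished for raid0, and that the concat run below repeats, boils down to the loop sketched here: write a random pattern through the exported /dev/nbd0, discard three block ranges, and check that the device still matches a reference file in which the same ranges were zeroed. Paths, block counts, and offsets are the ones traced above; the loop body condenses bdev_raid.sh's per-offset bookkeeping:

  blksize=512; nbd=/dev/nbd0; ref=/raidtest/raidrandtest
  dd if=/dev/urandom of="$ref" bs=$blksize count=4096
  dd if="$ref" of="$nbd" bs=$blksize count=4096 oflag=direct
  blockdev --flushbufs "$nbd"
  cmp -b -n 2097152 "$ref" "$nbd"
  for pair in "0 128" "1028 2035" "321 456"; do
      set -- $pair                       # $1 = block offset, $2 = block count
      dd if=/dev/zero of="$ref" bs=$blksize seek=$1 count=$2 conv=notrunc
      blkdiscard -o $(($1 * blksize)) -l $(($2 * blksize)) "$nbd"
      blockdev --flushbufs "$nbd"
      cmp -b -n 2097152 "$ref" "$nbd"    # discarded ranges must read back as zeroes
  done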
00:10:22.167 [2024-07-12 13:36:10.645831] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:22.426 [2024-07-12 13:36:10.777775] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:22.426 [2024-07-12 13:36:10.884537] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:22.426 [2024-07-12 13:36:10.950869] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:22.426 [2024-07-12 13:36:10.950900] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:22.993 13:36:11 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:22.993 13:36:11 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:10:22.993 13:36:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:10:23.252 13:36:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:10:23.252 13:36:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:23.252 13:36:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:10:23.252 13:36:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:23.252 [2024-07-12 13:36:11.828834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:23.252 [2024-07-12 13:36:11.829992] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:23.252 [2024-07-12 13:36:11.830046] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x272d4a0 00:10:23.252 [2024-07-12 13:36:11.830057] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:23.252 [2024-07-12 13:36:11.830302] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2730e30 00:10:23.252 [2024-07-12 13:36:11.830412] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x272d4a0 00:10:23.252 [2024-07-12 13:36:11.830423] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x272d4a0 00:10:23.252 [2024-07-12 13:36:11.830522] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:23.252 Base_1 00:10:23.252 Base_2 00:10:23.511 13:36:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:23.511 13:36:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:23.511 13:36:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:23.769 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:24.028 [2024-07-12 13:36:12.582886] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2730c40 00:10:24.028 /dev/nbd0 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:24.287 1+0 records in 00:10:24.287 1+0 records out 00:10:24.287 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258236 s, 15.9 MB/s 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:24.287 
13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:24.287 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:24.545 { 00:10:24.545 "nbd_device": "/dev/nbd0", 00:10:24.545 "bdev_name": "raid" 00:10:24.545 } 00:10:24.545 ]' 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:24.545 { 00:10:24.545 "nbd_device": "/dev/nbd0", 00:10:24.545 "bdev_name": "raid" 00:10:24.545 } 00:10:24.545 ]' 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:24.545 13:36:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd 
if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:24.545 4096+0 records in 00:10:24.545 4096+0 records out 00:10:24.545 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0298201 s, 70.3 MB/s 00:10:24.545 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:24.803 4096+0 records in 00:10:24.803 4096+0 records out 00:10:24.803 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.30305 s, 6.9 MB/s 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:24.803 128+0 records in 00:10:24.803 128+0 records out 00:10:24.803 65536 bytes (66 kB, 64 KiB) copied, 0.000848604 s, 77.2 MB/s 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:24.803 2035+0 records in 00:10:24.803 2035+0 records out 00:10:24.803 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0117989 s, 88.3 MB/s 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:24.803 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:25.061 456+0 records in 00:10:25.061 456+0 
records out 00:10:25.061 233472 bytes (233 kB, 228 KiB) copied, 0.00272322 s, 85.7 MB/s 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:25.061 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:25.320 [2024-07-12 13:36:13.667676] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:25.320 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:25.320 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:25.320 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:25.320 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:25.320 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:25.320 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:25.320 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:10:25.320 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:10:25.320 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:25.320 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:25.320 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:25.578 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:25.578 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:25.578 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:25.578 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:25.579 13:36:13 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:25.579 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:10:25.579 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:10:25.579 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:10:25.579 13:36:13 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:25.579 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:10:25.579 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:25.579 13:36:13 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 424996 00:10:25.579 13:36:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 424996 ']' 00:10:25.579 13:36:13 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 424996 00:10:25.579 13:36:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:10:25.579 13:36:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:25.579 13:36:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 424996 00:10:25.579 13:36:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:25.579 13:36:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:25.579 13:36:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 424996' 00:10:25.579 killing process with pid 424996 00:10:25.579 13:36:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 424996 00:10:25.579 [2024-07-12 13:36:14.047299] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:25.579 [2024-07-12 13:36:14.047363] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:25.579 [2024-07-12 13:36:14.047405] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:25.579 [2024-07-12 13:36:14.047420] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x272d4a0 name raid, state offline 00:10:25.579 13:36:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 424996 00:10:25.579 [2024-07-12 13:36:14.064245] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:25.837 13:36:14 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:10:25.837 00:10:25.837 real 0m3.699s 00:10:25.838 user 0m5.022s 00:10:25.838 sys 0m1.289s 00:10:25.838 13:36:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:25.838 13:36:14 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:25.838 ************************************ 00:10:25.838 END TEST raid_function_test_concat 00:10:25.838 ************************************ 00:10:25.838 13:36:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:25.838 13:36:14 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:10:25.838 13:36:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:25.838 13:36:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:25.838 13:36:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
00:10:25.838 ************************************ 00:10:25.838 START TEST raid0_resize_test 00:10:25.838 ************************************ 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=425529 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 425529' 00:10:25.838 Process raid pid: 425529 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 425529 /var/tmp/spdk-raid.sock 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 425529 ']' 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:25.838 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:25.838 13:36:14 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:26.095 [2024-07-12 13:36:14.432028] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
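The raid0_resize_test that has just started assembles its stack with the RPCs traced below: two 32 MiB null bdevs and a raid0 volume striped across them. Condensed, with rpc.py pointed at the raid socket used throughout this suite:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc bdev_null_create Base_1 32 512                           # 32 MiB null bdev, 512 B blocks
  $rpc bdev_null_create Base_2 32 512
  $rpc bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid   # raid0, 64 KiB strip size
  $rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'             # 131072 blocks x 512 B = 64 MiB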
00:10:26.095 [2024-07-12 13:36:14.432096] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:26.095 [2024-07-12 13:36:14.555674] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:26.095 [2024-07-12 13:36:14.652848] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:26.353 [2024-07-12 13:36:14.709894] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:26.353 [2024-07-12 13:36:14.709941] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:26.917 13:36:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:26.917 13:36:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:10:26.917 13:36:15 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:10:27.175 Base_1 00:10:27.175 13:36:15 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:10:27.433 Base_2 00:10:27.433 13:36:15 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:10:27.691 [2024-07-12 13:36:16.102643] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:27.691 [2024-07-12 13:36:16.104091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:27.691 [2024-07-12 13:36:16.104158] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1362650 00:10:27.691 [2024-07-12 13:36:16.104169] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:27.691 [2024-07-12 13:36:16.104371] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1362930 00:10:27.691 [2024-07-12 13:36:16.104461] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1362650 00:10:27.691 [2024-07-12 13:36:16.104471] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x1362650 00:10:27.691 [2024-07-12 13:36:16.104575] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:27.691 13:36:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:10:27.948 [2024-07-12 13:36:16.355278] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:27.948 [2024-07-12 13:36:16.355302] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:10:27.948 true 00:10:27.948 13:36:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:27.948 13:36:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:10:28.207 [2024-07-12 13:36:16.604087] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:28.207 13:36:16 bdev_raid.raid0_resize_test -- 
bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:10:28.207 13:36:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:10:28.207 13:36:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:10:28.207 13:36:16 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:10:28.775 [2024-07-12 13:36:17.113252] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:28.775 [2024-07-12 13:36:17.113276] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:10:28.775 [2024-07-12 13:36:17.113303] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:10:28.775 true 00:10:28.775 13:36:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:28.775 13:36:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:10:29.034 [2024-07-12 13:36:17.370100] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:29.034 13:36:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:10:29.034 13:36:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:10:29.034 13:36:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:10:29.034 13:36:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 425529 00:10:29.034 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 425529 ']' 00:10:29.034 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 425529 00:10:29.034 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:10:29.034 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:29.034 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 425529 00:10:29.035 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:29.035 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:29.035 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 425529' 00:10:29.035 killing process with pid 425529 00:10:29.035 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 425529 00:10:29.035 [2024-07-12 13:36:17.438458] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:29.035 [2024-07-12 13:36:17.438518] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:29.035 [2024-07-12 13:36:17.438560] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:29.035 [2024-07-12 13:36:17.438572] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1362650 name Raid, state offline 00:10:29.035 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 425529 00:10:29.035 [2024-07-12 13:36:17.439922] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:29.294 13:36:17 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:10:29.294 
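For reference: the raid0_resize_test body above is driven entirely through rpc.py against the bdev_svc app listening on /var/tmp/spdk-raid.sock. Two 32 MiB null bdevs are combined into a raid0 with a 64 KiB strip, and the raid only grows from 131072 to 262144 blocks once every base bdev has been resized. A minimal sketch of the same sequence, replayed by hand, follows; SPDK_DIR is a placeholder for the SPDK checkout (this run used /var/jenkins/workspace/crypto-phy-autotest/spdk), and every RPC shown appears verbatim in the trace.

  # start the bdev service with raid debug logging, same socket as in the trace;
  # the harness waits for the UNIX socket with waitforlisten before issuing RPCs
  $SPDK_DIR/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # two 32 MiB base bdevs with 512-byte blocks, assembled into a raid0 with a 64 KiB strip
  $RPC bdev_null_create Base_1 32 512
  $RPC bdev_null_create Base_2 32 512
  $RPC bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid

  # resizing only one base bdev leaves the raid at 131072 blocks (64 MiB)
  $RPC bdev_null_resize Base_1 64
  $RPC bdev_get_bdevs -b Raid | jq '.[].num_blocks'

  # once the second base bdev is resized as well, the raid grows to 262144 blocks (128 MiB)
  $RPC bdev_null_resize Base_2 64
  $RPC bdev_get_bdevs -b Raid | jq '.[].num_blocks'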
00:10:29.294 real 0m3.266s 00:10:29.294 user 0m5.123s 00:10:29.294 sys 0m0.693s 00:10:29.294 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:29.294 13:36:17 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:29.294 ************************************ 00:10:29.294 END TEST raid0_resize_test 00:10:29.294 ************************************ 00:10:29.294 13:36:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:29.294 13:36:17 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:10:29.294 13:36:17 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:29.294 13:36:17 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:10:29.294 13:36:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:29.294 13:36:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:29.294 13:36:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:29.294 ************************************ 00:10:29.294 START TEST raid_state_function_test 00:10:29.294 ************************************ 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:29.294 13:36:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:29.294 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:29.295 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:29.295 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=426001 00:10:29.295 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:29.295 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 426001' 00:10:29.295 Process raid pid: 426001 00:10:29.295 13:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 426001 /var/tmp/spdk-raid.sock 00:10:29.295 13:36:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 426001 ']' 00:10:29.295 13:36:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:29.295 13:36:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:29.295 13:36:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:29.295 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:29.295 13:36:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:29.295 13:36:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:29.295 [2024-07-12 13:36:17.784094] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:10:29.295 [2024-07-12 13:36:17.784176] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:29.554 [2024-07-12 13:36:17.928340] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:29.554 [2024-07-12 13:36:18.040122] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:29.554 [2024-07-12 13:36:18.103447] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.554 [2024-07-12 13:36:18.103474] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.813 13:36:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:29.813 13:36:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:29.813 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:30.382 [2024-07-12 13:36:18.730414] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:30.382 [2024-07-12 13:36:18.730458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:30.382 [2024-07-12 13:36:18.730468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:30.382 [2024-07-12 13:36:18.730480] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:30.382 13:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:30.642 13:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:30.642 "name": "Existed_Raid", 00:10:30.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.642 "strip_size_kb": 64, 00:10:30.642 "state": "configuring", 00:10:30.642 "raid_level": "raid0", 00:10:30.642 "superblock": false, 00:10:30.642 
"num_base_bdevs": 2, 00:10:30.642 "num_base_bdevs_discovered": 0, 00:10:30.642 "num_base_bdevs_operational": 2, 00:10:30.642 "base_bdevs_list": [ 00:10:30.642 { 00:10:30.642 "name": "BaseBdev1", 00:10:30.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.642 "is_configured": false, 00:10:30.642 "data_offset": 0, 00:10:30.642 "data_size": 0 00:10:30.642 }, 00:10:30.642 { 00:10:30.642 "name": "BaseBdev2", 00:10:30.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.642 "is_configured": false, 00:10:30.642 "data_offset": 0, 00:10:30.642 "data_size": 0 00:10:30.642 } 00:10:30.642 ] 00:10:30.642 }' 00:10:30.642 13:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:30.642 13:36:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:31.210 13:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:31.210 [2024-07-12 13:36:19.769046] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:31.210 [2024-07-12 13:36:19.769079] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2503330 name Existed_Raid, state configuring 00:10:31.467 13:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:31.467 [2024-07-12 13:36:19.953546] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:31.467 [2024-07-12 13:36:19.953577] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:31.467 [2024-07-12 13:36:19.953587] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:31.467 [2024-07-12 13:36:19.953603] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:31.467 13:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:31.725 [2024-07-12 13:36:20.216188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:31.726 BaseBdev1 00:10:31.726 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:31.726 13:36:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:31.726 13:36:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:31.726 13:36:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:31.726 13:36:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:31.726 13:36:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:31.726 13:36:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:31.985 13:36:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:32.244 [ 
00:10:32.244 { 00:10:32.244 "name": "BaseBdev1", 00:10:32.244 "aliases": [ 00:10:32.244 "6618f5f2-5e57-4528-8772-8cdccb8094b4" 00:10:32.244 ], 00:10:32.244 "product_name": "Malloc disk", 00:10:32.244 "block_size": 512, 00:10:32.244 "num_blocks": 65536, 00:10:32.244 "uuid": "6618f5f2-5e57-4528-8772-8cdccb8094b4", 00:10:32.244 "assigned_rate_limits": { 00:10:32.244 "rw_ios_per_sec": 0, 00:10:32.244 "rw_mbytes_per_sec": 0, 00:10:32.244 "r_mbytes_per_sec": 0, 00:10:32.244 "w_mbytes_per_sec": 0 00:10:32.244 }, 00:10:32.244 "claimed": true, 00:10:32.244 "claim_type": "exclusive_write", 00:10:32.244 "zoned": false, 00:10:32.244 "supported_io_types": { 00:10:32.244 "read": true, 00:10:32.244 "write": true, 00:10:32.244 "unmap": true, 00:10:32.244 "flush": true, 00:10:32.244 "reset": true, 00:10:32.244 "nvme_admin": false, 00:10:32.244 "nvme_io": false, 00:10:32.244 "nvme_io_md": false, 00:10:32.244 "write_zeroes": true, 00:10:32.244 "zcopy": true, 00:10:32.244 "get_zone_info": false, 00:10:32.244 "zone_management": false, 00:10:32.244 "zone_append": false, 00:10:32.244 "compare": false, 00:10:32.244 "compare_and_write": false, 00:10:32.244 "abort": true, 00:10:32.244 "seek_hole": false, 00:10:32.244 "seek_data": false, 00:10:32.244 "copy": true, 00:10:32.244 "nvme_iov_md": false 00:10:32.244 }, 00:10:32.244 "memory_domains": [ 00:10:32.244 { 00:10:32.244 "dma_device_id": "system", 00:10:32.244 "dma_device_type": 1 00:10:32.244 }, 00:10:32.244 { 00:10:32.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:32.244 "dma_device_type": 2 00:10:32.244 } 00:10:32.244 ], 00:10:32.244 "driver_specific": {} 00:10:32.244 } 00:10:32.244 ] 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:32.244 13:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:32.503 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:32.503 "name": "Existed_Raid", 00:10:32.503 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:32.503 "strip_size_kb": 64, 00:10:32.503 "state": "configuring", 00:10:32.503 "raid_level": "raid0", 
00:10:32.503 "superblock": false, 00:10:32.503 "num_base_bdevs": 2, 00:10:32.503 "num_base_bdevs_discovered": 1, 00:10:32.503 "num_base_bdevs_operational": 2, 00:10:32.503 "base_bdevs_list": [ 00:10:32.503 { 00:10:32.503 "name": "BaseBdev1", 00:10:32.503 "uuid": "6618f5f2-5e57-4528-8772-8cdccb8094b4", 00:10:32.503 "is_configured": true, 00:10:32.503 "data_offset": 0, 00:10:32.503 "data_size": 65536 00:10:32.503 }, 00:10:32.503 { 00:10:32.503 "name": "BaseBdev2", 00:10:32.503 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:32.503 "is_configured": false, 00:10:32.503 "data_offset": 0, 00:10:32.503 "data_size": 0 00:10:32.503 } 00:10:32.503 ] 00:10:32.503 }' 00:10:32.503 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:32.503 13:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:33.438 13:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:33.697 [2024-07-12 13:36:22.121255] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:33.697 [2024-07-12 13:36:22.121294] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2502c20 name Existed_Raid, state configuring 00:10:33.697 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:33.956 [2024-07-12 13:36:22.365942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:33.956 [2024-07-12 13:36:22.367419] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:33.956 [2024-07-12 13:36:22.367452] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:33.956 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:33.956 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:33.956 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:33.956 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:33.956 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:33.956 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:33.957 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:33.957 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:33.957 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:33.957 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:33.957 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:33.957 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:33.957 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:10:33.957 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:34.215 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:34.216 "name": "Existed_Raid", 00:10:34.216 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:34.216 "strip_size_kb": 64, 00:10:34.216 "state": "configuring", 00:10:34.216 "raid_level": "raid0", 00:10:34.216 "superblock": false, 00:10:34.216 "num_base_bdevs": 2, 00:10:34.216 "num_base_bdevs_discovered": 1, 00:10:34.216 "num_base_bdevs_operational": 2, 00:10:34.216 "base_bdevs_list": [ 00:10:34.216 { 00:10:34.216 "name": "BaseBdev1", 00:10:34.216 "uuid": "6618f5f2-5e57-4528-8772-8cdccb8094b4", 00:10:34.216 "is_configured": true, 00:10:34.216 "data_offset": 0, 00:10:34.216 "data_size": 65536 00:10:34.216 }, 00:10:34.216 { 00:10:34.216 "name": "BaseBdev2", 00:10:34.216 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:34.216 "is_configured": false, 00:10:34.216 "data_offset": 0, 00:10:34.216 "data_size": 0 00:10:34.216 } 00:10:34.216 ] 00:10:34.216 }' 00:10:34.216 13:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:34.216 13:36:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:34.783 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:35.041 [2024-07-12 13:36:23.472377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:35.041 [2024-07-12 13:36:23.472412] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2503a10 00:10:35.041 [2024-07-12 13:36:23.472421] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:35.041 [2024-07-12 13:36:23.472610] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a73b0 00:10:35.041 [2024-07-12 13:36:23.472723] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2503a10 00:10:35.041 [2024-07-12 13:36:23.472733] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2503a10 00:10:35.041 [2024-07-12 13:36:23.472897] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:35.041 BaseBdev2 00:10:35.041 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:35.041 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:35.041 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:35.041 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:35.041 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:35.041 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:35.041 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:35.299 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:35.558 [ 
00:10:35.558 { 00:10:35.558 "name": "BaseBdev2", 00:10:35.558 "aliases": [ 00:10:35.558 "dbe8c7ca-e631-4c0b-b0f9-de9c7bac7629" 00:10:35.558 ], 00:10:35.558 "product_name": "Malloc disk", 00:10:35.558 "block_size": 512, 00:10:35.558 "num_blocks": 65536, 00:10:35.558 "uuid": "dbe8c7ca-e631-4c0b-b0f9-de9c7bac7629", 00:10:35.558 "assigned_rate_limits": { 00:10:35.558 "rw_ios_per_sec": 0, 00:10:35.558 "rw_mbytes_per_sec": 0, 00:10:35.558 "r_mbytes_per_sec": 0, 00:10:35.558 "w_mbytes_per_sec": 0 00:10:35.558 }, 00:10:35.558 "claimed": true, 00:10:35.558 "claim_type": "exclusive_write", 00:10:35.558 "zoned": false, 00:10:35.558 "supported_io_types": { 00:10:35.558 "read": true, 00:10:35.558 "write": true, 00:10:35.558 "unmap": true, 00:10:35.558 "flush": true, 00:10:35.558 "reset": true, 00:10:35.559 "nvme_admin": false, 00:10:35.559 "nvme_io": false, 00:10:35.559 "nvme_io_md": false, 00:10:35.559 "write_zeroes": true, 00:10:35.559 "zcopy": true, 00:10:35.559 "get_zone_info": false, 00:10:35.559 "zone_management": false, 00:10:35.559 "zone_append": false, 00:10:35.559 "compare": false, 00:10:35.559 "compare_and_write": false, 00:10:35.559 "abort": true, 00:10:35.559 "seek_hole": false, 00:10:35.559 "seek_data": false, 00:10:35.559 "copy": true, 00:10:35.559 "nvme_iov_md": false 00:10:35.559 }, 00:10:35.559 "memory_domains": [ 00:10:35.559 { 00:10:35.559 "dma_device_id": "system", 00:10:35.559 "dma_device_type": 1 00:10:35.559 }, 00:10:35.559 { 00:10:35.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.559 "dma_device_type": 2 00:10:35.559 } 00:10:35.559 ], 00:10:35.559 "driver_specific": {} 00:10:35.559 } 00:10:35.559 ] 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:35.559 13:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:35.817 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:35.817 "name": "Existed_Raid", 00:10:35.817 "uuid": "091ead37-aa23-49d8-bc4a-52e0926d610b", 00:10:35.817 "strip_size_kb": 64, 00:10:35.817 "state": "online", 00:10:35.817 "raid_level": "raid0", 00:10:35.817 "superblock": false, 00:10:35.817 "num_base_bdevs": 2, 00:10:35.817 "num_base_bdevs_discovered": 2, 00:10:35.817 "num_base_bdevs_operational": 2, 00:10:35.817 "base_bdevs_list": [ 00:10:35.817 { 00:10:35.817 "name": "BaseBdev1", 00:10:35.817 "uuid": "6618f5f2-5e57-4528-8772-8cdccb8094b4", 00:10:35.817 "is_configured": true, 00:10:35.817 "data_offset": 0, 00:10:35.817 "data_size": 65536 00:10:35.817 }, 00:10:35.817 { 00:10:35.817 "name": "BaseBdev2", 00:10:35.817 "uuid": "dbe8c7ca-e631-4c0b-b0f9-de9c7bac7629", 00:10:35.817 "is_configured": true, 00:10:35.817 "data_offset": 0, 00:10:35.817 "data_size": 65536 00:10:35.817 } 00:10:35.817 ] 00:10:35.817 }' 00:10:35.817 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:35.817 13:36:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:36.385 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:36.385 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:36.385 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:36.385 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:36.385 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:36.385 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:36.385 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:36.385 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:36.644 [2024-07-12 13:36:24.968640] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:36.644 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:36.644 "name": "Existed_Raid", 00:10:36.644 "aliases": [ 00:10:36.644 "091ead37-aa23-49d8-bc4a-52e0926d610b" 00:10:36.644 ], 00:10:36.644 "product_name": "Raid Volume", 00:10:36.644 "block_size": 512, 00:10:36.644 "num_blocks": 131072, 00:10:36.644 "uuid": "091ead37-aa23-49d8-bc4a-52e0926d610b", 00:10:36.644 "assigned_rate_limits": { 00:10:36.644 "rw_ios_per_sec": 0, 00:10:36.644 "rw_mbytes_per_sec": 0, 00:10:36.644 "r_mbytes_per_sec": 0, 00:10:36.644 "w_mbytes_per_sec": 0 00:10:36.644 }, 00:10:36.644 "claimed": false, 00:10:36.644 "zoned": false, 00:10:36.644 "supported_io_types": { 00:10:36.644 "read": true, 00:10:36.644 "write": true, 00:10:36.644 "unmap": true, 00:10:36.644 "flush": true, 00:10:36.644 "reset": true, 00:10:36.644 "nvme_admin": false, 00:10:36.644 "nvme_io": false, 00:10:36.644 "nvme_io_md": false, 00:10:36.644 "write_zeroes": true, 00:10:36.644 "zcopy": false, 00:10:36.644 "get_zone_info": false, 00:10:36.644 "zone_management": false, 00:10:36.644 "zone_append": false, 00:10:36.644 "compare": false, 00:10:36.644 "compare_and_write": false, 00:10:36.644 "abort": false, 00:10:36.644 "seek_hole": false, 00:10:36.644 "seek_data": false, 00:10:36.644 "copy": false, 00:10:36.644 "nvme_iov_md": false 00:10:36.644 }, 00:10:36.644 
"memory_domains": [ 00:10:36.644 { 00:10:36.644 "dma_device_id": "system", 00:10:36.644 "dma_device_type": 1 00:10:36.644 }, 00:10:36.644 { 00:10:36.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.644 "dma_device_type": 2 00:10:36.644 }, 00:10:36.644 { 00:10:36.644 "dma_device_id": "system", 00:10:36.644 "dma_device_type": 1 00:10:36.644 }, 00:10:36.644 { 00:10:36.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.644 "dma_device_type": 2 00:10:36.644 } 00:10:36.644 ], 00:10:36.644 "driver_specific": { 00:10:36.644 "raid": { 00:10:36.644 "uuid": "091ead37-aa23-49d8-bc4a-52e0926d610b", 00:10:36.644 "strip_size_kb": 64, 00:10:36.644 "state": "online", 00:10:36.644 "raid_level": "raid0", 00:10:36.644 "superblock": false, 00:10:36.644 "num_base_bdevs": 2, 00:10:36.644 "num_base_bdevs_discovered": 2, 00:10:36.644 "num_base_bdevs_operational": 2, 00:10:36.644 "base_bdevs_list": [ 00:10:36.644 { 00:10:36.644 "name": "BaseBdev1", 00:10:36.644 "uuid": "6618f5f2-5e57-4528-8772-8cdccb8094b4", 00:10:36.644 "is_configured": true, 00:10:36.644 "data_offset": 0, 00:10:36.644 "data_size": 65536 00:10:36.644 }, 00:10:36.644 { 00:10:36.644 "name": "BaseBdev2", 00:10:36.644 "uuid": "dbe8c7ca-e631-4c0b-b0f9-de9c7bac7629", 00:10:36.644 "is_configured": true, 00:10:36.644 "data_offset": 0, 00:10:36.644 "data_size": 65536 00:10:36.644 } 00:10:36.644 ] 00:10:36.644 } 00:10:36.644 } 00:10:36.644 }' 00:10:36.644 13:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:36.644 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:36.644 BaseBdev2' 00:10:36.644 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:36.645 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:36.645 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:36.903 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:36.903 "name": "BaseBdev1", 00:10:36.903 "aliases": [ 00:10:36.903 "6618f5f2-5e57-4528-8772-8cdccb8094b4" 00:10:36.903 ], 00:10:36.903 "product_name": "Malloc disk", 00:10:36.903 "block_size": 512, 00:10:36.903 "num_blocks": 65536, 00:10:36.903 "uuid": "6618f5f2-5e57-4528-8772-8cdccb8094b4", 00:10:36.903 "assigned_rate_limits": { 00:10:36.903 "rw_ios_per_sec": 0, 00:10:36.903 "rw_mbytes_per_sec": 0, 00:10:36.903 "r_mbytes_per_sec": 0, 00:10:36.903 "w_mbytes_per_sec": 0 00:10:36.903 }, 00:10:36.903 "claimed": true, 00:10:36.903 "claim_type": "exclusive_write", 00:10:36.903 "zoned": false, 00:10:36.903 "supported_io_types": { 00:10:36.903 "read": true, 00:10:36.903 "write": true, 00:10:36.903 "unmap": true, 00:10:36.903 "flush": true, 00:10:36.903 "reset": true, 00:10:36.903 "nvme_admin": false, 00:10:36.903 "nvme_io": false, 00:10:36.903 "nvme_io_md": false, 00:10:36.903 "write_zeroes": true, 00:10:36.903 "zcopy": true, 00:10:36.903 "get_zone_info": false, 00:10:36.903 "zone_management": false, 00:10:36.903 "zone_append": false, 00:10:36.903 "compare": false, 00:10:36.903 "compare_and_write": false, 00:10:36.903 "abort": true, 00:10:36.903 "seek_hole": false, 00:10:36.903 "seek_data": false, 00:10:36.903 "copy": true, 00:10:36.904 "nvme_iov_md": false 00:10:36.904 }, 00:10:36.904 
"memory_domains": [ 00:10:36.904 { 00:10:36.904 "dma_device_id": "system", 00:10:36.904 "dma_device_type": 1 00:10:36.904 }, 00:10:36.904 { 00:10:36.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.904 "dma_device_type": 2 00:10:36.904 } 00:10:36.904 ], 00:10:36.904 "driver_specific": {} 00:10:36.904 }' 00:10:36.904 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:36.904 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:36.904 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:36.904 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:36.904 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:37.162 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:37.162 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:37.162 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:37.162 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:37.162 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:37.162 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:37.162 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:37.162 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:37.162 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:37.162 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:37.420 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:37.420 "name": "BaseBdev2", 00:10:37.420 "aliases": [ 00:10:37.420 "dbe8c7ca-e631-4c0b-b0f9-de9c7bac7629" 00:10:37.420 ], 00:10:37.420 "product_name": "Malloc disk", 00:10:37.420 "block_size": 512, 00:10:37.420 "num_blocks": 65536, 00:10:37.420 "uuid": "dbe8c7ca-e631-4c0b-b0f9-de9c7bac7629", 00:10:37.420 "assigned_rate_limits": { 00:10:37.420 "rw_ios_per_sec": 0, 00:10:37.421 "rw_mbytes_per_sec": 0, 00:10:37.421 "r_mbytes_per_sec": 0, 00:10:37.421 "w_mbytes_per_sec": 0 00:10:37.421 }, 00:10:37.421 "claimed": true, 00:10:37.421 "claim_type": "exclusive_write", 00:10:37.421 "zoned": false, 00:10:37.421 "supported_io_types": { 00:10:37.421 "read": true, 00:10:37.421 "write": true, 00:10:37.421 "unmap": true, 00:10:37.421 "flush": true, 00:10:37.421 "reset": true, 00:10:37.421 "nvme_admin": false, 00:10:37.421 "nvme_io": false, 00:10:37.421 "nvme_io_md": false, 00:10:37.421 "write_zeroes": true, 00:10:37.421 "zcopy": true, 00:10:37.421 "get_zone_info": false, 00:10:37.421 "zone_management": false, 00:10:37.421 "zone_append": false, 00:10:37.421 "compare": false, 00:10:37.421 "compare_and_write": false, 00:10:37.421 "abort": true, 00:10:37.421 "seek_hole": false, 00:10:37.421 "seek_data": false, 00:10:37.421 "copy": true, 00:10:37.421 "nvme_iov_md": false 00:10:37.421 }, 00:10:37.421 "memory_domains": [ 00:10:37.421 { 00:10:37.421 "dma_device_id": "system", 00:10:37.421 "dma_device_type": 1 00:10:37.421 }, 00:10:37.421 { 00:10:37.421 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:10:37.421 "dma_device_type": 2 00:10:37.421 } 00:10:37.421 ], 00:10:37.421 "driver_specific": {} 00:10:37.421 }' 00:10:37.421 13:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:37.679 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:37.679 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:37.679 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:37.679 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:37.679 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:37.679 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:37.679 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:37.679 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:37.679 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:37.938 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:37.938 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:37.938 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:37.938 [2024-07-12 13:36:26.520550] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:37.938 [2024-07-12 13:36:26.520575] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:37.938 [2024-07-12 13:36:26.520615] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:38.197 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:38.197 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:38.197 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:38.197 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:38.197 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:38.197 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:38.197 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:38.197 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:38.197 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:38.197 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:38.198 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:38.198 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:38.198 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:38.198 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:38.198 13:36:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:38.198 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:38.198 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:38.198 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:38.198 "name": "Existed_Raid", 00:10:38.198 "uuid": "091ead37-aa23-49d8-bc4a-52e0926d610b", 00:10:38.198 "strip_size_kb": 64, 00:10:38.198 "state": "offline", 00:10:38.198 "raid_level": "raid0", 00:10:38.198 "superblock": false, 00:10:38.198 "num_base_bdevs": 2, 00:10:38.198 "num_base_bdevs_discovered": 1, 00:10:38.198 "num_base_bdevs_operational": 1, 00:10:38.198 "base_bdevs_list": [ 00:10:38.198 { 00:10:38.198 "name": null, 00:10:38.198 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:38.198 "is_configured": false, 00:10:38.198 "data_offset": 0, 00:10:38.198 "data_size": 65536 00:10:38.198 }, 00:10:38.198 { 00:10:38.198 "name": "BaseBdev2", 00:10:38.198 "uuid": "dbe8c7ca-e631-4c0b-b0f9-de9c7bac7629", 00:10:38.198 "is_configured": true, 00:10:38.198 "data_offset": 0, 00:10:38.198 "data_size": 65536 00:10:38.198 } 00:10:38.198 ] 00:10:38.198 }' 00:10:38.198 13:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:38.198 13:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:38.765 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:38.765 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:38.765 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:38.765 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:39.024 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:39.024 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:39.024 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:39.283 [2024-07-12 13:36:27.800977] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:39.283 [2024-07-12 13:36:27.801027] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2503a10 name Existed_Raid, state offline 00:10:39.283 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:39.283 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:39.283 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:39.283 13:36:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:39.542 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:39.542 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:39.542 13:36:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:39.542 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 426001 00:10:39.543 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 426001 ']' 00:10:39.543 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 426001 00:10:39.543 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:39.543 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:39.543 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 426001 00:10:39.803 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:39.803 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:39.803 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 426001' 00:10:39.803 killing process with pid 426001 00:10:39.803 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 426001 00:10:39.803 [2024-07-12 13:36:28.126890] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:39.803 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 426001 00:10:39.803 [2024-07-12 13:36:28.127755] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:39.803 13:36:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:39.803 00:10:39.803 real 0m10.625s 00:10:39.803 user 0m19.321s 00:10:39.803 sys 0m2.003s 00:10:39.803 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:39.803 13:36:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:39.803 ************************************ 00:10:39.803 END TEST raid_state_function_test 00:10:39.803 ************************************ 00:10:39.803 13:36:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:39.803 13:36:28 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:10:39.803 13:36:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:39.803 13:36:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:39.803 13:36:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:40.064 ************************************ 00:10:40.064 START TEST raid_state_function_test_sb 00:10:40.064 ************************************ 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=427637 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 427637' 00:10:40.064 Process raid pid: 427637 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 427637 /var/tmp/spdk-raid.sock 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 427637 ']' 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:40.064 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:40.064 13:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:40.064 [2024-07-12 13:36:28.490138] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:10:40.064 [2024-07-12 13:36:28.490216] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:40.064 [2024-07-12 13:36:28.638909] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:40.323 [2024-07-12 13:36:28.751144] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:40.323 [2024-07-12 13:36:28.820764] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:40.323 [2024-07-12 13:36:28.820796] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:40.582 13:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:40.582 13:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:40.582 13:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:40.842 [2024-07-12 13:36:29.178884] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:40.842 [2024-07-12 13:36:29.178934] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:40.842 [2024-07-12 13:36:29.178946] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:40.842 [2024-07-12 13:36:29.178958] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:40.842 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:40.842 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:40.842 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:40.842 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:40.842 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:40.843 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:40.843 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:40.843 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:40.843 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:40.843 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:40.843 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:40.843 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:41.102 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:41.102 "name": "Existed_Raid", 00:10:41.102 "uuid": "1faf1461-493a-4fb2-85be-7d304fbe03ea", 00:10:41.102 "strip_size_kb": 64, 00:10:41.102 "state": "configuring", 00:10:41.102 "raid_level": 
"raid0", 00:10:41.102 "superblock": true, 00:10:41.102 "num_base_bdevs": 2, 00:10:41.102 "num_base_bdevs_discovered": 0, 00:10:41.102 "num_base_bdevs_operational": 2, 00:10:41.102 "base_bdevs_list": [ 00:10:41.102 { 00:10:41.102 "name": "BaseBdev1", 00:10:41.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:41.102 "is_configured": false, 00:10:41.102 "data_offset": 0, 00:10:41.102 "data_size": 0 00:10:41.102 }, 00:10:41.102 { 00:10:41.102 "name": "BaseBdev2", 00:10:41.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:41.102 "is_configured": false, 00:10:41.102 "data_offset": 0, 00:10:41.102 "data_size": 0 00:10:41.102 } 00:10:41.102 ] 00:10:41.102 }' 00:10:41.102 13:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:41.102 13:36:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:41.669 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:41.669 [2024-07-12 13:36:30.225528] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:41.669 [2024-07-12 13:36:30.225561] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x141b330 name Existed_Raid, state configuring 00:10:41.669 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:41.928 [2024-07-12 13:36:30.474210] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:41.928 [2024-07-12 13:36:30.474243] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:41.928 [2024-07-12 13:36:30.474252] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:41.928 [2024-07-12 13:36:30.474264] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:41.928 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:42.188 [2024-07-12 13:36:30.732726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:42.188 BaseBdev1 00:10:42.188 13:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:42.188 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:42.188 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:42.188 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:42.188 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:42.188 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:42.188 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:42.448 13:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:42.707 [ 00:10:42.707 { 00:10:42.707 "name": "BaseBdev1", 00:10:42.707 "aliases": [ 00:10:42.707 "05c12cef-f5ad-4d05-bdb5-e3c85bfa42f9" 00:10:42.707 ], 00:10:42.707 "product_name": "Malloc disk", 00:10:42.707 "block_size": 512, 00:10:42.707 "num_blocks": 65536, 00:10:42.707 "uuid": "05c12cef-f5ad-4d05-bdb5-e3c85bfa42f9", 00:10:42.707 "assigned_rate_limits": { 00:10:42.707 "rw_ios_per_sec": 0, 00:10:42.707 "rw_mbytes_per_sec": 0, 00:10:42.707 "r_mbytes_per_sec": 0, 00:10:42.707 "w_mbytes_per_sec": 0 00:10:42.707 }, 00:10:42.707 "claimed": true, 00:10:42.707 "claim_type": "exclusive_write", 00:10:42.707 "zoned": false, 00:10:42.707 "supported_io_types": { 00:10:42.707 "read": true, 00:10:42.707 "write": true, 00:10:42.707 "unmap": true, 00:10:42.707 "flush": true, 00:10:42.707 "reset": true, 00:10:42.707 "nvme_admin": false, 00:10:42.707 "nvme_io": false, 00:10:42.707 "nvme_io_md": false, 00:10:42.707 "write_zeroes": true, 00:10:42.707 "zcopy": true, 00:10:42.707 "get_zone_info": false, 00:10:42.707 "zone_management": false, 00:10:42.707 "zone_append": false, 00:10:42.707 "compare": false, 00:10:42.707 "compare_and_write": false, 00:10:42.707 "abort": true, 00:10:42.707 "seek_hole": false, 00:10:42.707 "seek_data": false, 00:10:42.707 "copy": true, 00:10:42.707 "nvme_iov_md": false 00:10:42.707 }, 00:10:42.707 "memory_domains": [ 00:10:42.707 { 00:10:42.707 "dma_device_id": "system", 00:10:42.707 "dma_device_type": 1 00:10:42.707 }, 00:10:42.707 { 00:10:42.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:42.707 "dma_device_type": 2 00:10:42.707 } 00:10:42.707 ], 00:10:42.707 "driver_specific": {} 00:10:42.707 } 00:10:42.707 ] 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:42.707 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:42.966 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:42.966 "name": 
"Existed_Raid", 00:10:42.966 "uuid": "0b555c6b-f5ea-4dfc-a702-1f1575ab44af", 00:10:42.966 "strip_size_kb": 64, 00:10:42.966 "state": "configuring", 00:10:42.966 "raid_level": "raid0", 00:10:42.966 "superblock": true, 00:10:42.966 "num_base_bdevs": 2, 00:10:42.966 "num_base_bdevs_discovered": 1, 00:10:42.966 "num_base_bdevs_operational": 2, 00:10:42.966 "base_bdevs_list": [ 00:10:42.966 { 00:10:42.966 "name": "BaseBdev1", 00:10:42.966 "uuid": "05c12cef-f5ad-4d05-bdb5-e3c85bfa42f9", 00:10:42.966 "is_configured": true, 00:10:42.966 "data_offset": 2048, 00:10:42.966 "data_size": 63488 00:10:42.966 }, 00:10:42.966 { 00:10:42.966 "name": "BaseBdev2", 00:10:42.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:42.966 "is_configured": false, 00:10:42.966 "data_offset": 0, 00:10:42.966 "data_size": 0 00:10:42.966 } 00:10:42.966 ] 00:10:42.966 }' 00:10:42.966 13:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:42.966 13:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:43.903 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:43.903 [2024-07-12 13:36:32.421207] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:43.903 [2024-07-12 13:36:32.421246] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x141ac20 name Existed_Raid, state configuring 00:10:43.903 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:44.161 [2024-07-12 13:36:32.673922] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:44.161 [2024-07-12 13:36:32.675405] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:44.162 [2024-07-12 13:36:32.675436] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@124 -- # local tmp 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:44.162 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:44.421 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:44.421 "name": "Existed_Raid", 00:10:44.421 "uuid": "4e0d8cb7-133c-4d73-8c5c-450e7e34c003", 00:10:44.421 "strip_size_kb": 64, 00:10:44.421 "state": "configuring", 00:10:44.421 "raid_level": "raid0", 00:10:44.421 "superblock": true, 00:10:44.421 "num_base_bdevs": 2, 00:10:44.421 "num_base_bdevs_discovered": 1, 00:10:44.421 "num_base_bdevs_operational": 2, 00:10:44.421 "base_bdevs_list": [ 00:10:44.421 { 00:10:44.421 "name": "BaseBdev1", 00:10:44.421 "uuid": "05c12cef-f5ad-4d05-bdb5-e3c85bfa42f9", 00:10:44.421 "is_configured": true, 00:10:44.421 "data_offset": 2048, 00:10:44.421 "data_size": 63488 00:10:44.421 }, 00:10:44.421 { 00:10:44.421 "name": "BaseBdev2", 00:10:44.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:44.421 "is_configured": false, 00:10:44.421 "data_offset": 0, 00:10:44.421 "data_size": 0 00:10:44.421 } 00:10:44.421 ] 00:10:44.421 }' 00:10:44.421 13:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:44.421 13:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:45.356 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:45.356 [2024-07-12 13:36:33.796212] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:45.356 [2024-07-12 13:36:33.796352] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x141ba10 00:10:45.356 [2024-07-12 13:36:33.796365] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:45.356 [2024-07-12 13:36:33.796535] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x141ab70 00:10:45.356 [2024-07-12 13:36:33.796651] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x141ba10 00:10:45.356 [2024-07-12 13:36:33.796661] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x141ba10 00:10:45.356 [2024-07-12 13:36:33.796755] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:45.356 BaseBdev2 00:10:45.356 13:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:45.356 13:36:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:45.356 13:36:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:45.356 13:36:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:45.356 13:36:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:45.356 13:36:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:45.356 13:36:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:45.614 13:36:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:45.614 [ 00:10:45.614 { 00:10:45.614 "name": "BaseBdev2", 00:10:45.614 "aliases": [ 00:10:45.614 "46b7e8ab-2469-48c0-91a3-7bb6c516a1b0" 00:10:45.614 ], 00:10:45.614 "product_name": "Malloc disk", 00:10:45.614 "block_size": 512, 00:10:45.614 "num_blocks": 65536, 00:10:45.615 "uuid": "46b7e8ab-2469-48c0-91a3-7bb6c516a1b0", 00:10:45.615 "assigned_rate_limits": { 00:10:45.615 "rw_ios_per_sec": 0, 00:10:45.615 "rw_mbytes_per_sec": 0, 00:10:45.615 "r_mbytes_per_sec": 0, 00:10:45.615 "w_mbytes_per_sec": 0 00:10:45.615 }, 00:10:45.615 "claimed": true, 00:10:45.615 "claim_type": "exclusive_write", 00:10:45.615 "zoned": false, 00:10:45.615 "supported_io_types": { 00:10:45.615 "read": true, 00:10:45.615 "write": true, 00:10:45.615 "unmap": true, 00:10:45.615 "flush": true, 00:10:45.615 "reset": true, 00:10:45.615 "nvme_admin": false, 00:10:45.615 "nvme_io": false, 00:10:45.615 "nvme_io_md": false, 00:10:45.615 "write_zeroes": true, 00:10:45.615 "zcopy": true, 00:10:45.615 "get_zone_info": false, 00:10:45.615 "zone_management": false, 00:10:45.615 "zone_append": false, 00:10:45.615 "compare": false, 00:10:45.615 "compare_and_write": false, 00:10:45.615 "abort": true, 00:10:45.615 "seek_hole": false, 00:10:45.615 "seek_data": false, 00:10:45.615 "copy": true, 00:10:45.615 "nvme_iov_md": false 00:10:45.615 }, 00:10:45.615 "memory_domains": [ 00:10:45.615 { 00:10:45.615 "dma_device_id": "system", 00:10:45.615 "dma_device_type": 1 00:10:45.615 }, 00:10:45.615 { 00:10:45.615 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:45.615 "dma_device_type": 2 00:10:45.615 } 00:10:45.615 ], 00:10:45.615 "driver_specific": {} 00:10:45.615 } 00:10:45.615 ] 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:45.615 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:45.873 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:45.873 "name": "Existed_Raid", 00:10:45.873 "uuid": "4e0d8cb7-133c-4d73-8c5c-450e7e34c003", 00:10:45.873 "strip_size_kb": 64, 00:10:45.873 "state": "online", 00:10:45.873 "raid_level": "raid0", 00:10:45.873 "superblock": true, 00:10:45.873 "num_base_bdevs": 2, 00:10:45.873 "num_base_bdevs_discovered": 2, 00:10:45.873 "num_base_bdevs_operational": 2, 00:10:45.873 "base_bdevs_list": [ 00:10:45.873 { 00:10:45.873 "name": "BaseBdev1", 00:10:45.873 "uuid": "05c12cef-f5ad-4d05-bdb5-e3c85bfa42f9", 00:10:45.873 "is_configured": true, 00:10:45.873 "data_offset": 2048, 00:10:45.873 "data_size": 63488 00:10:45.873 }, 00:10:45.873 { 00:10:45.873 "name": "BaseBdev2", 00:10:45.873 "uuid": "46b7e8ab-2469-48c0-91a3-7bb6c516a1b0", 00:10:45.873 "is_configured": true, 00:10:45.873 "data_offset": 2048, 00:10:45.873 "data_size": 63488 00:10:45.873 } 00:10:45.873 ] 00:10:45.873 }' 00:10:45.873 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:45.873 13:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:46.440 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:46.440 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:46.440 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:46.440 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:46.440 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:46.440 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:46.440 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:46.440 13:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:46.700 [2024-07-12 13:36:35.152084] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:46.700 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:46.700 "name": "Existed_Raid", 00:10:46.700 "aliases": [ 00:10:46.700 "4e0d8cb7-133c-4d73-8c5c-450e7e34c003" 00:10:46.700 ], 00:10:46.700 "product_name": "Raid Volume", 00:10:46.700 "block_size": 512, 00:10:46.700 "num_blocks": 126976, 00:10:46.700 "uuid": "4e0d8cb7-133c-4d73-8c5c-450e7e34c003", 00:10:46.700 "assigned_rate_limits": { 00:10:46.700 "rw_ios_per_sec": 0, 00:10:46.700 "rw_mbytes_per_sec": 0, 00:10:46.700 "r_mbytes_per_sec": 0, 00:10:46.700 "w_mbytes_per_sec": 0 00:10:46.700 }, 00:10:46.700 "claimed": false, 00:10:46.700 "zoned": false, 00:10:46.700 "supported_io_types": { 00:10:46.700 "read": true, 00:10:46.700 "write": true, 00:10:46.700 "unmap": true, 00:10:46.700 "flush": true, 00:10:46.700 "reset": true, 00:10:46.700 "nvme_admin": false, 00:10:46.700 "nvme_io": false, 00:10:46.700 "nvme_io_md": false, 00:10:46.700 "write_zeroes": true, 
00:10:46.700 "zcopy": false, 00:10:46.700 "get_zone_info": false, 00:10:46.700 "zone_management": false, 00:10:46.700 "zone_append": false, 00:10:46.700 "compare": false, 00:10:46.700 "compare_and_write": false, 00:10:46.700 "abort": false, 00:10:46.700 "seek_hole": false, 00:10:46.700 "seek_data": false, 00:10:46.700 "copy": false, 00:10:46.700 "nvme_iov_md": false 00:10:46.700 }, 00:10:46.700 "memory_domains": [ 00:10:46.700 { 00:10:46.700 "dma_device_id": "system", 00:10:46.700 "dma_device_type": 1 00:10:46.700 }, 00:10:46.700 { 00:10:46.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.700 "dma_device_type": 2 00:10:46.700 }, 00:10:46.700 { 00:10:46.700 "dma_device_id": "system", 00:10:46.700 "dma_device_type": 1 00:10:46.700 }, 00:10:46.700 { 00:10:46.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.700 "dma_device_type": 2 00:10:46.700 } 00:10:46.700 ], 00:10:46.700 "driver_specific": { 00:10:46.700 "raid": { 00:10:46.700 "uuid": "4e0d8cb7-133c-4d73-8c5c-450e7e34c003", 00:10:46.700 "strip_size_kb": 64, 00:10:46.700 "state": "online", 00:10:46.700 "raid_level": "raid0", 00:10:46.700 "superblock": true, 00:10:46.700 "num_base_bdevs": 2, 00:10:46.700 "num_base_bdevs_discovered": 2, 00:10:46.700 "num_base_bdevs_operational": 2, 00:10:46.700 "base_bdevs_list": [ 00:10:46.700 { 00:10:46.700 "name": "BaseBdev1", 00:10:46.700 "uuid": "05c12cef-f5ad-4d05-bdb5-e3c85bfa42f9", 00:10:46.700 "is_configured": true, 00:10:46.700 "data_offset": 2048, 00:10:46.700 "data_size": 63488 00:10:46.700 }, 00:10:46.700 { 00:10:46.700 "name": "BaseBdev2", 00:10:46.700 "uuid": "46b7e8ab-2469-48c0-91a3-7bb6c516a1b0", 00:10:46.700 "is_configured": true, 00:10:46.700 "data_offset": 2048, 00:10:46.700 "data_size": 63488 00:10:46.700 } 00:10:46.700 ] 00:10:46.700 } 00:10:46.700 } 00:10:46.700 }' 00:10:46.700 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:46.700 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:46.700 BaseBdev2' 00:10:46.700 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:46.700 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:46.700 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:46.960 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:46.960 "name": "BaseBdev1", 00:10:46.960 "aliases": [ 00:10:46.960 "05c12cef-f5ad-4d05-bdb5-e3c85bfa42f9" 00:10:46.960 ], 00:10:46.960 "product_name": "Malloc disk", 00:10:46.960 "block_size": 512, 00:10:46.960 "num_blocks": 65536, 00:10:46.960 "uuid": "05c12cef-f5ad-4d05-bdb5-e3c85bfa42f9", 00:10:46.960 "assigned_rate_limits": { 00:10:46.960 "rw_ios_per_sec": 0, 00:10:46.960 "rw_mbytes_per_sec": 0, 00:10:46.960 "r_mbytes_per_sec": 0, 00:10:46.960 "w_mbytes_per_sec": 0 00:10:46.960 }, 00:10:46.960 "claimed": true, 00:10:46.960 "claim_type": "exclusive_write", 00:10:46.960 "zoned": false, 00:10:46.960 "supported_io_types": { 00:10:46.960 "read": true, 00:10:46.960 "write": true, 00:10:46.960 "unmap": true, 00:10:46.960 "flush": true, 00:10:46.960 "reset": true, 00:10:46.960 "nvme_admin": false, 00:10:46.960 "nvme_io": false, 00:10:46.960 "nvme_io_md": false, 00:10:46.960 
"write_zeroes": true, 00:10:46.960 "zcopy": true, 00:10:46.960 "get_zone_info": false, 00:10:46.960 "zone_management": false, 00:10:46.960 "zone_append": false, 00:10:46.960 "compare": false, 00:10:46.960 "compare_and_write": false, 00:10:46.960 "abort": true, 00:10:46.960 "seek_hole": false, 00:10:46.960 "seek_data": false, 00:10:46.960 "copy": true, 00:10:46.960 "nvme_iov_md": false 00:10:46.960 }, 00:10:46.960 "memory_domains": [ 00:10:46.960 { 00:10:46.960 "dma_device_id": "system", 00:10:46.960 "dma_device_type": 1 00:10:46.960 }, 00:10:46.960 { 00:10:46.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.960 "dma_device_type": 2 00:10:46.960 } 00:10:46.960 ], 00:10:46.960 "driver_specific": {} 00:10:46.960 }' 00:10:46.960 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:46.960 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:47.218 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:47.218 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:47.218 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:47.218 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:47.218 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:47.218 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:47.219 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:47.219 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:47.219 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:47.219 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:47.219 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:47.219 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:47.219 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:47.478 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:47.478 "name": "BaseBdev2", 00:10:47.478 "aliases": [ 00:10:47.478 "46b7e8ab-2469-48c0-91a3-7bb6c516a1b0" 00:10:47.478 ], 00:10:47.478 "product_name": "Malloc disk", 00:10:47.478 "block_size": 512, 00:10:47.478 "num_blocks": 65536, 00:10:47.478 "uuid": "46b7e8ab-2469-48c0-91a3-7bb6c516a1b0", 00:10:47.478 "assigned_rate_limits": { 00:10:47.478 "rw_ios_per_sec": 0, 00:10:47.478 "rw_mbytes_per_sec": 0, 00:10:47.478 "r_mbytes_per_sec": 0, 00:10:47.478 "w_mbytes_per_sec": 0 00:10:47.478 }, 00:10:47.478 "claimed": true, 00:10:47.478 "claim_type": "exclusive_write", 00:10:47.478 "zoned": false, 00:10:47.478 "supported_io_types": { 00:10:47.478 "read": true, 00:10:47.478 "write": true, 00:10:47.478 "unmap": true, 00:10:47.478 "flush": true, 00:10:47.478 "reset": true, 00:10:47.478 "nvme_admin": false, 00:10:47.478 "nvme_io": false, 00:10:47.478 "nvme_io_md": false, 00:10:47.478 "write_zeroes": true, 00:10:47.478 "zcopy": true, 00:10:47.478 "get_zone_info": false, 00:10:47.478 "zone_management": false, 
00:10:47.478 "zone_append": false, 00:10:47.478 "compare": false, 00:10:47.478 "compare_and_write": false, 00:10:47.478 "abort": true, 00:10:47.478 "seek_hole": false, 00:10:47.478 "seek_data": false, 00:10:47.478 "copy": true, 00:10:47.478 "nvme_iov_md": false 00:10:47.478 }, 00:10:47.478 "memory_domains": [ 00:10:47.478 { 00:10:47.478 "dma_device_id": "system", 00:10:47.478 "dma_device_type": 1 00:10:47.478 }, 00:10:47.478 { 00:10:47.478 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:47.478 "dma_device_type": 2 00:10:47.478 } 00:10:47.478 ], 00:10:47.478 "driver_specific": {} 00:10:47.478 }' 00:10:47.478 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:47.478 13:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:47.478 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:47.478 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:47.737 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:47.737 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:47.737 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:47.737 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:47.737 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:47.737 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:47.737 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:47.737 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:47.737 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:47.997 [2024-07-12 13:36:36.507471] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:47.997 [2024-07-12 13:36:36.507498] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:47.997 [2024-07-12 13:36:36.507537] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.997 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:48.256 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:48.256 "name": "Existed_Raid", 00:10:48.256 "uuid": "4e0d8cb7-133c-4d73-8c5c-450e7e34c003", 00:10:48.256 "strip_size_kb": 64, 00:10:48.256 "state": "offline", 00:10:48.256 "raid_level": "raid0", 00:10:48.256 "superblock": true, 00:10:48.256 "num_base_bdevs": 2, 00:10:48.256 "num_base_bdevs_discovered": 1, 00:10:48.256 "num_base_bdevs_operational": 1, 00:10:48.256 "base_bdevs_list": [ 00:10:48.256 { 00:10:48.256 "name": null, 00:10:48.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:48.256 "is_configured": false, 00:10:48.256 "data_offset": 2048, 00:10:48.256 "data_size": 63488 00:10:48.256 }, 00:10:48.256 { 00:10:48.256 "name": "BaseBdev2", 00:10:48.256 "uuid": "46b7e8ab-2469-48c0-91a3-7bb6c516a1b0", 00:10:48.256 "is_configured": true, 00:10:48.256 "data_offset": 2048, 00:10:48.256 "data_size": 63488 00:10:48.256 } 00:10:48.256 ] 00:10:48.256 }' 00:10:48.256 13:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:48.256 13:36:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:48.825 13:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:48.825 13:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:48.825 13:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:48.825 13:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:49.085 13:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:49.085 13:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:49.085 13:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:49.651 [2024-07-12 13:36:38.073559] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:49.651 [2024-07-12 13:36:38.073610] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x141ba10 name Existed_Raid, state offline 00:10:49.651 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:49.651 13:36:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:49.651 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:49.651 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 427637 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 427637 ']' 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 427637 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 427637 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 427637' 00:10:49.910 killing process with pid 427637 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 427637 00:10:49.910 [2024-07-12 13:36:38.410538] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:49.910 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 427637 00:10:49.910 [2024-07-12 13:36:38.411410] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:50.169 13:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:50.169 00:10:50.169 real 0m10.202s 00:10:50.169 user 0m18.561s 00:10:50.169 sys 0m1.985s 00:10:50.169 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:50.169 13:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:50.169 ************************************ 00:10:50.169 END TEST raid_state_function_test_sb 00:10:50.169 ************************************ 00:10:50.169 13:36:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:50.169 13:36:38 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:10:50.169 13:36:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:10:50.169 13:36:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:50.169 13:36:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:50.169 ************************************ 00:10:50.169 START TEST raid_superblock_test 00:10:50.169 ************************************ 00:10:50.169 13:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:10:50.169 13:36:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:10:50.169 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=429237 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 429237 /var/tmp/spdk-raid.sock 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 429237 ']' 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:50.170 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:50.170 13:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:50.430 [2024-07-12 13:36:38.771173] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
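The raid_superblock_test run that begins above builds each base device as a passthru bdev with a fixed UUID stacked on a 32 MiB malloc bdev, so the raid superblock records stable base-bdev identities, and then assembles them into raid_bdev1. A minimal sketch condensed from the RPCs traced in the lines that follow (the $rpc and $sock shorthands are introduced here only for brevity; the commands, UUIDs and sizes are the ones in the log, and bdev_svc is assumed to already be listening on the socket):

  # Shorthands only; the paths are the ones used in this job's workspace.
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Base devices: malloc bdevs wrapped by passthru bdevs carrying fixed UUIDs.
  $rpc -s $sock bdev_malloc_create 32 512 -b malloc1
  $rpc -s $sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $rpc -s $sock bdev_malloc_create 32 512 -b malloc2
  $rpc -s $sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  # Assemble a raid0 volume over the passthru bdevs with a 64 KiB strip size and a superblock (-s).
  $rpc -s $sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s
  # The raid bdev and its generated UUID are visible through normal bdev enumeration.
  $rpc -s $sock bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid'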
00:10:50.430 [2024-07-12 13:36:38.771253] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid429237 ] 00:10:50.430 [2024-07-12 13:36:38.915641] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:50.690 [2024-07-12 13:36:39.022903] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:50.690 [2024-07-12 13:36:39.092372] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:50.690 [2024-07-12 13:36:39.092430] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:51.257 13:36:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:51.257 13:36:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:10:51.257 13:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:51.257 13:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:51.257 13:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:51.257 13:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:51.257 13:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:51.257 13:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:51.257 13:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:51.257 13:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:51.258 13:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:51.516 malloc1 00:10:51.516 13:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:52.085 [2024-07-12 13:36:40.407529] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:52.085 [2024-07-12 13:36:40.407582] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:52.085 [2024-07-12 13:36:40.407604] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b10e90 00:10:52.085 [2024-07-12 13:36:40.407617] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:52.085 [2024-07-12 13:36:40.409347] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:52.085 [2024-07-12 13:36:40.409374] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:52.085 pt1 00:10:52.085 13:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:52.085 13:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:52.085 13:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:52.085 13:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:52.085 13:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:52.085 13:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:52.085 13:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:52.085 13:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:52.085 13:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:52.653 malloc2 00:10:52.653 13:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:52.653 [2024-07-12 13:36:41.187544] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:52.653 [2024-07-12 13:36:41.187592] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:52.653 [2024-07-12 13:36:41.187610] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1baefb0 00:10:52.653 [2024-07-12 13:36:41.187622] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:52.653 [2024-07-12 13:36:41.189192] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:52.653 [2024-07-12 13:36:41.189220] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:52.653 pt2 00:10:52.653 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:52.653 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:52.653 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:53.222 [2024-07-12 13:36:41.684864] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:53.222 [2024-07-12 13:36:41.686242] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:53.222 [2024-07-12 13:36:41.686383] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1baf6b0 00:10:53.222 [2024-07-12 13:36:41.686396] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:53.222 [2024-07-12 13:36:41.686598] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b12220 00:10:53.222 [2024-07-12 13:36:41.686735] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1baf6b0 00:10:53.222 [2024-07-12 13:36:41.686746] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1baf6b0 00:10:53.222 [2024-07-12 13:36:41.686847] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:53.222 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:53.222 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:53.222 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:53.222 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:53.222 13:36:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:53.222 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:53.222 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:53.222 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:53.222 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:53.222 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:53.222 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:53.222 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:53.482 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:53.482 "name": "raid_bdev1", 00:10:53.482 "uuid": "35dee301-5ee7-433f-af47-5e29f9b5cf93", 00:10:53.482 "strip_size_kb": 64, 00:10:53.482 "state": "online", 00:10:53.482 "raid_level": "raid0", 00:10:53.482 "superblock": true, 00:10:53.482 "num_base_bdevs": 2, 00:10:53.482 "num_base_bdevs_discovered": 2, 00:10:53.482 "num_base_bdevs_operational": 2, 00:10:53.482 "base_bdevs_list": [ 00:10:53.482 { 00:10:53.482 "name": "pt1", 00:10:53.482 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:53.482 "is_configured": true, 00:10:53.482 "data_offset": 2048, 00:10:53.482 "data_size": 63488 00:10:53.482 }, 00:10:53.482 { 00:10:53.482 "name": "pt2", 00:10:53.482 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:53.482 "is_configured": true, 00:10:53.482 "data_offset": 2048, 00:10:53.482 "data_size": 63488 00:10:53.482 } 00:10:53.482 ] 00:10:53.482 }' 00:10:53.482 13:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:53.482 13:36:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:54.051 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:54.051 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:54.051 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:54.051 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:54.051 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:54.051 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:54.051 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:54.051 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:54.310 [2024-07-12 13:36:42.824096] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:54.310 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:54.310 "name": "raid_bdev1", 00:10:54.310 "aliases": [ 00:10:54.311 "35dee301-5ee7-433f-af47-5e29f9b5cf93" 00:10:54.311 ], 00:10:54.311 "product_name": "Raid Volume", 00:10:54.311 "block_size": 512, 00:10:54.311 "num_blocks": 126976, 00:10:54.311 "uuid": 
"35dee301-5ee7-433f-af47-5e29f9b5cf93", 00:10:54.311 "assigned_rate_limits": { 00:10:54.311 "rw_ios_per_sec": 0, 00:10:54.311 "rw_mbytes_per_sec": 0, 00:10:54.311 "r_mbytes_per_sec": 0, 00:10:54.311 "w_mbytes_per_sec": 0 00:10:54.311 }, 00:10:54.311 "claimed": false, 00:10:54.311 "zoned": false, 00:10:54.311 "supported_io_types": { 00:10:54.311 "read": true, 00:10:54.311 "write": true, 00:10:54.311 "unmap": true, 00:10:54.311 "flush": true, 00:10:54.311 "reset": true, 00:10:54.311 "nvme_admin": false, 00:10:54.311 "nvme_io": false, 00:10:54.311 "nvme_io_md": false, 00:10:54.311 "write_zeroes": true, 00:10:54.311 "zcopy": false, 00:10:54.311 "get_zone_info": false, 00:10:54.311 "zone_management": false, 00:10:54.311 "zone_append": false, 00:10:54.311 "compare": false, 00:10:54.311 "compare_and_write": false, 00:10:54.311 "abort": false, 00:10:54.311 "seek_hole": false, 00:10:54.311 "seek_data": false, 00:10:54.311 "copy": false, 00:10:54.311 "nvme_iov_md": false 00:10:54.311 }, 00:10:54.311 "memory_domains": [ 00:10:54.311 { 00:10:54.311 "dma_device_id": "system", 00:10:54.311 "dma_device_type": 1 00:10:54.311 }, 00:10:54.311 { 00:10:54.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.311 "dma_device_type": 2 00:10:54.311 }, 00:10:54.311 { 00:10:54.311 "dma_device_id": "system", 00:10:54.311 "dma_device_type": 1 00:10:54.311 }, 00:10:54.311 { 00:10:54.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.311 "dma_device_type": 2 00:10:54.311 } 00:10:54.311 ], 00:10:54.311 "driver_specific": { 00:10:54.311 "raid": { 00:10:54.311 "uuid": "35dee301-5ee7-433f-af47-5e29f9b5cf93", 00:10:54.311 "strip_size_kb": 64, 00:10:54.311 "state": "online", 00:10:54.311 "raid_level": "raid0", 00:10:54.311 "superblock": true, 00:10:54.311 "num_base_bdevs": 2, 00:10:54.311 "num_base_bdevs_discovered": 2, 00:10:54.311 "num_base_bdevs_operational": 2, 00:10:54.311 "base_bdevs_list": [ 00:10:54.311 { 00:10:54.311 "name": "pt1", 00:10:54.311 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:54.311 "is_configured": true, 00:10:54.311 "data_offset": 2048, 00:10:54.311 "data_size": 63488 00:10:54.311 }, 00:10:54.311 { 00:10:54.311 "name": "pt2", 00:10:54.311 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:54.311 "is_configured": true, 00:10:54.311 "data_offset": 2048, 00:10:54.311 "data_size": 63488 00:10:54.311 } 00:10:54.311 ] 00:10:54.311 } 00:10:54.311 } 00:10:54.311 }' 00:10:54.311 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:54.311 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:54.311 pt2' 00:10:54.311 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:54.572 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:54.572 13:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:54.572 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:54.572 "name": "pt1", 00:10:54.572 "aliases": [ 00:10:54.572 "00000000-0000-0000-0000-000000000001" 00:10:54.572 ], 00:10:54.572 "product_name": "passthru", 00:10:54.572 "block_size": 512, 00:10:54.572 "num_blocks": 65536, 00:10:54.573 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:54.573 "assigned_rate_limits": { 00:10:54.573 
"rw_ios_per_sec": 0, 00:10:54.573 "rw_mbytes_per_sec": 0, 00:10:54.573 "r_mbytes_per_sec": 0, 00:10:54.573 "w_mbytes_per_sec": 0 00:10:54.573 }, 00:10:54.573 "claimed": true, 00:10:54.573 "claim_type": "exclusive_write", 00:10:54.573 "zoned": false, 00:10:54.573 "supported_io_types": { 00:10:54.573 "read": true, 00:10:54.573 "write": true, 00:10:54.573 "unmap": true, 00:10:54.573 "flush": true, 00:10:54.573 "reset": true, 00:10:54.573 "nvme_admin": false, 00:10:54.573 "nvme_io": false, 00:10:54.573 "nvme_io_md": false, 00:10:54.573 "write_zeroes": true, 00:10:54.573 "zcopy": true, 00:10:54.573 "get_zone_info": false, 00:10:54.573 "zone_management": false, 00:10:54.573 "zone_append": false, 00:10:54.573 "compare": false, 00:10:54.573 "compare_and_write": false, 00:10:54.573 "abort": true, 00:10:54.573 "seek_hole": false, 00:10:54.573 "seek_data": false, 00:10:54.573 "copy": true, 00:10:54.573 "nvme_iov_md": false 00:10:54.573 }, 00:10:54.573 "memory_domains": [ 00:10:54.573 { 00:10:54.573 "dma_device_id": "system", 00:10:54.573 "dma_device_type": 1 00:10:54.573 }, 00:10:54.573 { 00:10:54.573 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.573 "dma_device_type": 2 00:10:54.573 } 00:10:54.573 ], 00:10:54.573 "driver_specific": { 00:10:54.573 "passthru": { 00:10:54.573 "name": "pt1", 00:10:54.573 "base_bdev_name": "malloc1" 00:10:54.573 } 00:10:54.573 } 00:10:54.573 }' 00:10:54.573 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:54.832 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:54.832 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:54.832 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:54.832 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:54.832 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:54.832 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:54.832 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:54.832 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:54.832 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.090 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.090 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:55.090 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:55.090 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:55.090 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:55.349 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:55.349 "name": "pt2", 00:10:55.349 "aliases": [ 00:10:55.349 "00000000-0000-0000-0000-000000000002" 00:10:55.349 ], 00:10:55.349 "product_name": "passthru", 00:10:55.349 "block_size": 512, 00:10:55.349 "num_blocks": 65536, 00:10:55.349 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:55.349 "assigned_rate_limits": { 00:10:55.349 "rw_ios_per_sec": 0, 00:10:55.349 "rw_mbytes_per_sec": 0, 00:10:55.349 "r_mbytes_per_sec": 0, 00:10:55.349 "w_mbytes_per_sec": 0 
00:10:55.349 }, 00:10:55.349 "claimed": true, 00:10:55.349 "claim_type": "exclusive_write", 00:10:55.349 "zoned": false, 00:10:55.349 "supported_io_types": { 00:10:55.349 "read": true, 00:10:55.349 "write": true, 00:10:55.349 "unmap": true, 00:10:55.349 "flush": true, 00:10:55.349 "reset": true, 00:10:55.349 "nvme_admin": false, 00:10:55.349 "nvme_io": false, 00:10:55.349 "nvme_io_md": false, 00:10:55.349 "write_zeroes": true, 00:10:55.349 "zcopy": true, 00:10:55.349 "get_zone_info": false, 00:10:55.349 "zone_management": false, 00:10:55.349 "zone_append": false, 00:10:55.349 "compare": false, 00:10:55.349 "compare_and_write": false, 00:10:55.349 "abort": true, 00:10:55.349 "seek_hole": false, 00:10:55.349 "seek_data": false, 00:10:55.349 "copy": true, 00:10:55.349 "nvme_iov_md": false 00:10:55.349 }, 00:10:55.349 "memory_domains": [ 00:10:55.349 { 00:10:55.349 "dma_device_id": "system", 00:10:55.349 "dma_device_type": 1 00:10:55.349 }, 00:10:55.349 { 00:10:55.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.349 "dma_device_type": 2 00:10:55.349 } 00:10:55.349 ], 00:10:55.349 "driver_specific": { 00:10:55.349 "passthru": { 00:10:55.349 "name": "pt2", 00:10:55.349 "base_bdev_name": "malloc2" 00:10:55.349 } 00:10:55.349 } 00:10:55.349 }' 00:10:55.349 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.349 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.349 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:55.349 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.349 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.349 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:55.349 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.608 13:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.608 13:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:55.608 13:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.608 13:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.608 13:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:55.608 13:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:55.608 13:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:56.173 [2024-07-12 13:36:44.560725] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:56.173 13:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=35dee301-5ee7-433f-af47-5e29f9b5cf93 00:10:56.173 13:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 35dee301-5ee7-433f-af47-5e29f9b5cf93 ']' 00:10:56.173 13:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:56.432 [2024-07-12 13:36:44.817140] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:56.432 [2024-07-12 13:36:44.817164] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:10:56.432 [2024-07-12 13:36:44.817225] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:56.432 [2024-07-12 13:36:44.817268] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:56.432 [2024-07-12 13:36:44.817280] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1baf6b0 name raid_bdev1, state offline 00:10:56.432 13:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.432 13:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:56.690 13:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:56.690 13:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:56.690 13:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:56.690 13:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:57.255 13:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:57.255 13:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:57.514 13:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:57.514 13:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:57.773 [2024-07-12 13:36:46.333078] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:57.773 [2024-07-12 13:36:46.334465] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:57.773 [2024-07-12 13:36:46.334519] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:57.773 [2024-07-12 13:36:46.334558] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:57.773 [2024-07-12 13:36:46.334576] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:57.773 [2024-07-12 13:36:46.334585] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b13440 name raid_bdev1, state configuring 00:10:57.773 request: 00:10:57.773 { 00:10:57.773 "name": "raid_bdev1", 00:10:57.773 "raid_level": "raid0", 00:10:57.773 "base_bdevs": [ 00:10:57.773 "malloc1", 00:10:57.773 "malloc2" 00:10:57.773 ], 00:10:57.773 "strip_size_kb": 64, 00:10:57.773 "superblock": false, 00:10:57.773 "method": "bdev_raid_create", 00:10:57.773 "req_id": 1 00:10:57.773 } 00:10:57.773 Got JSON-RPC error response 00:10:57.773 response: 00:10:57.773 { 00:10:57.773 "code": -17, 00:10:57.773 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:57.773 } 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.773 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:58.031 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:58.031 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:58.031 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:58.290 [2024-07-12 13:36:46.838359] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:58.290 [2024-07-12 13:36:46.838407] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:58.290 [2024-07-12 13:36:46.838427] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b14040 00:10:58.290 [2024-07-12 13:36:46.838439] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:58.290 [2024-07-12 13:36:46.840090] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:58.290 [2024-07-12 13:36:46.840118] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:58.290 [2024-07-12 13:36:46.840184] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:58.290 [2024-07-12 13:36:46.840209] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:58.290 pt1 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.290 13:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:58.548 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:58.548 "name": "raid_bdev1", 00:10:58.548 "uuid": "35dee301-5ee7-433f-af47-5e29f9b5cf93", 00:10:58.548 "strip_size_kb": 64, 00:10:58.548 "state": "configuring", 00:10:58.548 "raid_level": "raid0", 00:10:58.548 "superblock": true, 00:10:58.548 "num_base_bdevs": 2, 00:10:58.548 "num_base_bdevs_discovered": 1, 00:10:58.548 "num_base_bdevs_operational": 2, 00:10:58.548 "base_bdevs_list": [ 00:10:58.548 { 00:10:58.548 "name": "pt1", 00:10:58.548 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:58.548 "is_configured": true, 00:10:58.548 "data_offset": 2048, 00:10:58.548 "data_size": 63488 00:10:58.548 }, 00:10:58.548 { 00:10:58.548 "name": null, 00:10:58.548 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:58.548 "is_configured": false, 00:10:58.548 "data_offset": 2048, 00:10:58.548 "data_size": 63488 00:10:58.548 } 00:10:58.548 ] 00:10:58.548 }' 00:10:58.548 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:58.548 13:36:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:59.113 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:59.113 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:59.113 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:59.113 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:59.371 [2024-07-12 13:36:47.845179] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:59.371 [2024-07-12 13:36:47.845226] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:59.371 [2024-07-12 13:36:47.845244] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b11ab0 00:10:59.371 [2024-07-12 13:36:47.845256] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:59.371 [2024-07-12 13:36:47.845592] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:59.371 [2024-07-12 13:36:47.845608] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:59.371 [2024-07-12 13:36:47.845668] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:59.371 [2024-07-12 13:36:47.845687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:59.371 [2024-07-12 13:36:47.845779] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bb1970 00:10:59.371 [2024-07-12 13:36:47.845789] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:59.371 [2024-07-12 13:36:47.845969] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1baf580 00:10:59.371 [2024-07-12 13:36:47.846092] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bb1970 00:10:59.371 [2024-07-12 13:36:47.846101] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bb1970 00:10:59.371 [2024-07-12 13:36:47.846200] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:59.371 pt2 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.371 13:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:59.629 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:59.629 "name": "raid_bdev1", 00:10:59.629 "uuid": "35dee301-5ee7-433f-af47-5e29f9b5cf93", 00:10:59.629 "strip_size_kb": 64, 00:10:59.629 "state": "online", 00:10:59.629 "raid_level": "raid0", 00:10:59.629 "superblock": true, 00:10:59.629 "num_base_bdevs": 2, 00:10:59.629 "num_base_bdevs_discovered": 2, 00:10:59.629 "num_base_bdevs_operational": 2, 00:10:59.629 "base_bdevs_list": [ 00:10:59.629 { 00:10:59.629 "name": "pt1", 00:10:59.629 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:59.629 "is_configured": true, 00:10:59.629 "data_offset": 2048, 00:10:59.629 "data_size": 63488 00:10:59.629 }, 00:10:59.629 { 00:10:59.629 "name": "pt2", 00:10:59.629 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:59.629 "is_configured": true, 00:10:59.629 "data_offset": 2048, 00:10:59.629 "data_size": 63488 00:10:59.629 } 00:10:59.629 ] 00:10:59.629 }' 00:10:59.629 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:59.629 13:36:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:00.250 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:00.250 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:00.250 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:00.250 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:00.250 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:00.250 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:00.250 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:00.250 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:00.507 [2024-07-12 13:36:48.876179] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:00.507 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:00.507 "name": "raid_bdev1", 00:11:00.507 "aliases": [ 00:11:00.507 "35dee301-5ee7-433f-af47-5e29f9b5cf93" 00:11:00.507 ], 00:11:00.507 "product_name": "Raid Volume", 00:11:00.507 "block_size": 512, 00:11:00.507 "num_blocks": 126976, 00:11:00.507 "uuid": "35dee301-5ee7-433f-af47-5e29f9b5cf93", 00:11:00.507 "assigned_rate_limits": { 00:11:00.507 "rw_ios_per_sec": 0, 00:11:00.507 "rw_mbytes_per_sec": 0, 00:11:00.507 "r_mbytes_per_sec": 0, 00:11:00.507 "w_mbytes_per_sec": 0 00:11:00.507 }, 00:11:00.507 "claimed": false, 00:11:00.507 "zoned": false, 00:11:00.507 "supported_io_types": { 00:11:00.507 "read": true, 00:11:00.507 "write": true, 00:11:00.507 "unmap": true, 00:11:00.507 "flush": true, 00:11:00.507 "reset": true, 00:11:00.507 "nvme_admin": false, 00:11:00.507 "nvme_io": false, 00:11:00.507 "nvme_io_md": false, 00:11:00.507 "write_zeroes": true, 00:11:00.507 "zcopy": false, 00:11:00.507 "get_zone_info": false, 00:11:00.507 "zone_management": false, 00:11:00.507 "zone_append": false, 00:11:00.507 "compare": false, 00:11:00.507 "compare_and_write": false, 00:11:00.507 "abort": false, 00:11:00.507 "seek_hole": false, 00:11:00.507 "seek_data": false, 00:11:00.507 "copy": false, 00:11:00.507 "nvme_iov_md": false 00:11:00.507 }, 00:11:00.507 "memory_domains": [ 00:11:00.507 { 00:11:00.507 "dma_device_id": 
"system", 00:11:00.507 "dma_device_type": 1 00:11:00.507 }, 00:11:00.507 { 00:11:00.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:00.507 "dma_device_type": 2 00:11:00.507 }, 00:11:00.507 { 00:11:00.507 "dma_device_id": "system", 00:11:00.507 "dma_device_type": 1 00:11:00.507 }, 00:11:00.507 { 00:11:00.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:00.507 "dma_device_type": 2 00:11:00.507 } 00:11:00.507 ], 00:11:00.507 "driver_specific": { 00:11:00.507 "raid": { 00:11:00.507 "uuid": "35dee301-5ee7-433f-af47-5e29f9b5cf93", 00:11:00.507 "strip_size_kb": 64, 00:11:00.507 "state": "online", 00:11:00.507 "raid_level": "raid0", 00:11:00.507 "superblock": true, 00:11:00.507 "num_base_bdevs": 2, 00:11:00.507 "num_base_bdevs_discovered": 2, 00:11:00.507 "num_base_bdevs_operational": 2, 00:11:00.507 "base_bdevs_list": [ 00:11:00.507 { 00:11:00.507 "name": "pt1", 00:11:00.507 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:00.507 "is_configured": true, 00:11:00.507 "data_offset": 2048, 00:11:00.507 "data_size": 63488 00:11:00.507 }, 00:11:00.507 { 00:11:00.507 "name": "pt2", 00:11:00.507 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:00.507 "is_configured": true, 00:11:00.507 "data_offset": 2048, 00:11:00.507 "data_size": 63488 00:11:00.507 } 00:11:00.507 ] 00:11:00.507 } 00:11:00.507 } 00:11:00.507 }' 00:11:00.507 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:00.507 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:00.508 pt2' 00:11:00.508 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:00.508 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:00.508 13:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:00.764 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:00.764 "name": "pt1", 00:11:00.764 "aliases": [ 00:11:00.764 "00000000-0000-0000-0000-000000000001" 00:11:00.764 ], 00:11:00.764 "product_name": "passthru", 00:11:00.764 "block_size": 512, 00:11:00.764 "num_blocks": 65536, 00:11:00.764 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:00.764 "assigned_rate_limits": { 00:11:00.764 "rw_ios_per_sec": 0, 00:11:00.764 "rw_mbytes_per_sec": 0, 00:11:00.764 "r_mbytes_per_sec": 0, 00:11:00.764 "w_mbytes_per_sec": 0 00:11:00.764 }, 00:11:00.764 "claimed": true, 00:11:00.764 "claim_type": "exclusive_write", 00:11:00.764 "zoned": false, 00:11:00.764 "supported_io_types": { 00:11:00.764 "read": true, 00:11:00.764 "write": true, 00:11:00.764 "unmap": true, 00:11:00.764 "flush": true, 00:11:00.764 "reset": true, 00:11:00.764 "nvme_admin": false, 00:11:00.764 "nvme_io": false, 00:11:00.764 "nvme_io_md": false, 00:11:00.764 "write_zeroes": true, 00:11:00.764 "zcopy": true, 00:11:00.764 "get_zone_info": false, 00:11:00.764 "zone_management": false, 00:11:00.764 "zone_append": false, 00:11:00.764 "compare": false, 00:11:00.764 "compare_and_write": false, 00:11:00.764 "abort": true, 00:11:00.764 "seek_hole": false, 00:11:00.764 "seek_data": false, 00:11:00.764 "copy": true, 00:11:00.764 "nvme_iov_md": false 00:11:00.764 }, 00:11:00.764 "memory_domains": [ 00:11:00.764 { 00:11:00.764 "dma_device_id": "system", 00:11:00.764 "dma_device_type": 1 00:11:00.764 }, 
00:11:00.764 { 00:11:00.764 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:00.764 "dma_device_type": 2 00:11:00.764 } 00:11:00.764 ], 00:11:00.764 "driver_specific": { 00:11:00.764 "passthru": { 00:11:00.764 "name": "pt1", 00:11:00.764 "base_bdev_name": "malloc1" 00:11:00.764 } 00:11:00.764 } 00:11:00.764 }' 00:11:00.764 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:00.764 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:00.764 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:00.764 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:00.764 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:01.021 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:01.021 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:01.021 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:01.021 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:01.021 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:01.021 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:01.021 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:01.021 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:01.021 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:01.021 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:01.279 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:01.279 "name": "pt2", 00:11:01.279 "aliases": [ 00:11:01.279 "00000000-0000-0000-0000-000000000002" 00:11:01.279 ], 00:11:01.279 "product_name": "passthru", 00:11:01.279 "block_size": 512, 00:11:01.279 "num_blocks": 65536, 00:11:01.279 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:01.279 "assigned_rate_limits": { 00:11:01.279 "rw_ios_per_sec": 0, 00:11:01.279 "rw_mbytes_per_sec": 0, 00:11:01.279 "r_mbytes_per_sec": 0, 00:11:01.279 "w_mbytes_per_sec": 0 00:11:01.279 }, 00:11:01.279 "claimed": true, 00:11:01.279 "claim_type": "exclusive_write", 00:11:01.279 "zoned": false, 00:11:01.279 "supported_io_types": { 00:11:01.279 "read": true, 00:11:01.279 "write": true, 00:11:01.279 "unmap": true, 00:11:01.279 "flush": true, 00:11:01.279 "reset": true, 00:11:01.279 "nvme_admin": false, 00:11:01.279 "nvme_io": false, 00:11:01.279 "nvme_io_md": false, 00:11:01.279 "write_zeroes": true, 00:11:01.279 "zcopy": true, 00:11:01.279 "get_zone_info": false, 00:11:01.279 "zone_management": false, 00:11:01.279 "zone_append": false, 00:11:01.279 "compare": false, 00:11:01.279 "compare_and_write": false, 00:11:01.279 "abort": true, 00:11:01.279 "seek_hole": false, 00:11:01.279 "seek_data": false, 00:11:01.279 "copy": true, 00:11:01.279 "nvme_iov_md": false 00:11:01.279 }, 00:11:01.279 "memory_domains": [ 00:11:01.279 { 00:11:01.279 "dma_device_id": "system", 00:11:01.279 "dma_device_type": 1 00:11:01.279 }, 00:11:01.279 { 00:11:01.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:01.279 "dma_device_type": 2 00:11:01.279 } 00:11:01.279 ], 
00:11:01.279 "driver_specific": { 00:11:01.279 "passthru": { 00:11:01.279 "name": "pt2", 00:11:01.279 "base_bdev_name": "malloc2" 00:11:01.279 } 00:11:01.279 } 00:11:01.279 }' 00:11:01.279 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:01.279 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:01.537 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:01.537 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:01.537 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:01.537 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:01.537 13:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:01.537 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:01.537 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:01.537 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:01.537 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:01.796 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:01.796 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:01.796 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:01.796 [2024-07-12 13:36:50.372173] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 35dee301-5ee7-433f-af47-5e29f9b5cf93 '!=' 35dee301-5ee7-433f-af47-5e29f9b5cf93 ']' 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 429237 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 429237 ']' 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 429237 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 429237 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 429237' 00:11:02.055 killing process with pid 429237 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 429237 00:11:02.055 [2024-07-12 13:36:50.449043] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:02.055 [2024-07-12 
13:36:50.449098] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:02.055 [2024-07-12 13:36:50.449145] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:02.055 [2024-07-12 13:36:50.449157] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bb1970 name raid_bdev1, state offline 00:11:02.055 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 429237 00:11:02.055 [2024-07-12 13:36:50.467079] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:02.315 13:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:02.315 00:11:02.315 real 0m11.986s 00:11:02.315 user 0m21.508s 00:11:02.315 sys 0m2.113s 00:11:02.315 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:02.315 13:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:02.315 ************************************ 00:11:02.315 END TEST raid_superblock_test 00:11:02.315 ************************************ 00:11:02.315 13:36:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:02.315 13:36:50 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:11:02.315 13:36:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:02.315 13:36:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:02.315 13:36:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:02.315 ************************************ 00:11:02.315 START TEST raid_read_error_test 00:11:02.315 ************************************ 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:02.315 13:36:50 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.cKdd2Sl9pG 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=430994 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 430994 /var/tmp/spdk-raid.sock 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 430994 ']' 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:02.315 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:02.315 13:36:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:02.315 [2024-07-12 13:36:50.849434] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
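For context, once this bdevperf instance is listening the read-error test configures the array over JSON-RPC. A condensed sketch of that sequence, assuming the same rpc.py path and /var/tmp/spdk-raid.sock socket used throughout this run (the individual commands appear verbatim in the trace that follows; this is an illustration, not a replacement for bdev_raid.sh):

    # Minimal sketch of the read-error setup driven by raid_io_error_test.
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc             # backing malloc bdev, leg 1
    $RPC bdev_error_create BaseBdev1_malloc                        # wrap it for error injection
    $RPC bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1  # expose it as BaseBdev1
    $RPC bdev_malloc_create 32 512 -b BaseBdev2_malloc             # same treatment for leg 2
    $RPC bdev_error_create BaseBdev2_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
    $RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure  # fail reads on one leg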
00:11:02.315 [2024-07-12 13:36:50.849498] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid430994 ] 00:11:02.575 [2024-07-12 13:36:50.979397] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:02.575 [2024-07-12 13:36:51.088020] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:02.575 [2024-07-12 13:36:51.151249] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:02.575 [2024-07-12 13:36:51.151283] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:02.834 13:36:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:02.834 13:36:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:02.834 13:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:02.834 13:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:03.092 BaseBdev1_malloc 00:11:03.093 13:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:03.352 true 00:11:03.352 13:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:03.612 [2024-07-12 13:36:52.027982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:03.612 [2024-07-12 13:36:52.028028] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:03.612 [2024-07-12 13:36:52.028049] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xddba10 00:11:03.612 [2024-07-12 13:36:52.028062] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:03.612 [2024-07-12 13:36:52.029975] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:03.612 [2024-07-12 13:36:52.030003] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:03.612 BaseBdev1 00:11:03.612 13:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:03.612 13:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:03.872 BaseBdev2_malloc 00:11:03.872 13:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:04.131 true 00:11:04.131 13:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:04.391 [2024-07-12 13:36:52.763787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:04.391 [2024-07-12 13:36:52.763831] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:04.391 [2024-07-12 13:36:52.763859] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde0250 00:11:04.391 [2024-07-12 13:36:52.763872] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:04.391 [2024-07-12 13:36:52.765486] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:04.391 [2024-07-12 13:36:52.765513] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:04.391 BaseBdev2 00:11:04.391 13:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:04.649 [2024-07-12 13:36:53.008478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:04.649 [2024-07-12 13:36:53.009909] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:04.649 [2024-07-12 13:36:53.010106] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xde1c60 00:11:04.649 [2024-07-12 13:36:53.010119] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:04.649 [2024-07-12 13:36:53.010325] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xde0bb0 00:11:04.649 [2024-07-12 13:36:53.010474] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xde1c60 00:11:04.649 [2024-07-12 13:36:53.010484] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xde1c60 00:11:04.649 [2024-07-12 13:36:53.010594] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.649 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:04.907 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:04.907 "name": "raid_bdev1", 00:11:04.907 "uuid": "f7e27cd6-fc5e-46a4-a4bb-15ec468476a9", 00:11:04.907 "strip_size_kb": 64, 00:11:04.907 "state": "online", 00:11:04.907 "raid_level": "raid0", 
00:11:04.907 "superblock": true, 00:11:04.907 "num_base_bdevs": 2, 00:11:04.907 "num_base_bdevs_discovered": 2, 00:11:04.907 "num_base_bdevs_operational": 2, 00:11:04.907 "base_bdevs_list": [ 00:11:04.907 { 00:11:04.907 "name": "BaseBdev1", 00:11:04.907 "uuid": "3c47cfcf-70b9-5d10-9114-bf7e69f4e077", 00:11:04.907 "is_configured": true, 00:11:04.907 "data_offset": 2048, 00:11:04.907 "data_size": 63488 00:11:04.907 }, 00:11:04.907 { 00:11:04.907 "name": "BaseBdev2", 00:11:04.907 "uuid": "45a367f3-1a5a-5521-b434-32aee96861fe", 00:11:04.907 "is_configured": true, 00:11:04.907 "data_offset": 2048, 00:11:04.908 "data_size": 63488 00:11:04.908 } 00:11:04.908 ] 00:11:04.908 }' 00:11:04.908 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:04.908 13:36:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:05.476 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:05.476 13:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:05.476 [2024-07-12 13:36:53.975310] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xddd2f0 00:11:06.413 13:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:06.673 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:06.933 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:06.933 "name": "raid_bdev1", 00:11:06.933 "uuid": "f7e27cd6-fc5e-46a4-a4bb-15ec468476a9", 00:11:06.933 "strip_size_kb": 64, 00:11:06.933 "state": "online", 00:11:06.933 
"raid_level": "raid0", 00:11:06.933 "superblock": true, 00:11:06.933 "num_base_bdevs": 2, 00:11:06.933 "num_base_bdevs_discovered": 2, 00:11:06.933 "num_base_bdevs_operational": 2, 00:11:06.933 "base_bdevs_list": [ 00:11:06.933 { 00:11:06.933 "name": "BaseBdev1", 00:11:06.933 "uuid": "3c47cfcf-70b9-5d10-9114-bf7e69f4e077", 00:11:06.933 "is_configured": true, 00:11:06.933 "data_offset": 2048, 00:11:06.933 "data_size": 63488 00:11:06.933 }, 00:11:06.933 { 00:11:06.933 "name": "BaseBdev2", 00:11:06.933 "uuid": "45a367f3-1a5a-5521-b434-32aee96861fe", 00:11:06.933 "is_configured": true, 00:11:06.933 "data_offset": 2048, 00:11:06.933 "data_size": 63488 00:11:06.933 } 00:11:06.933 ] 00:11:06.933 }' 00:11:06.933 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:06.933 13:36:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:07.502 13:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:07.761 [2024-07-12 13:36:56.185004] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:07.761 [2024-07-12 13:36:56.185045] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:07.761 [2024-07-12 13:36:56.188229] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:07.761 [2024-07-12 13:36:56.188259] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:07.761 [2024-07-12 13:36:56.188286] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:07.761 [2024-07-12 13:36:56.188297] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xde1c60 name raid_bdev1, state offline 00:11:07.761 0 00:11:07.761 13:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 430994 00:11:07.761 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 430994 ']' 00:11:07.761 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 430994 00:11:07.761 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:07.761 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:07.761 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 430994 00:11:07.761 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:07.761 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:07.761 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 430994' 00:11:07.761 killing process with pid 430994 00:11:07.761 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 430994 00:11:07.761 [2024-07-12 13:36:56.267580] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:07.761 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 430994 00:11:07.761 [2024-07-12 13:36:56.277964] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:08.021 13:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.cKdd2Sl9pG 00:11:08.021 13:36:56 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:08.021 13:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:08.021 13:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:11:08.021 13:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:08.021 13:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:08.021 13:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:08.021 13:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:11:08.021 00:11:08.021 real 0m5.727s 00:11:08.021 user 0m9.271s 00:11:08.021 sys 0m1.068s 00:11:08.021 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:08.021 13:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.021 ************************************ 00:11:08.021 END TEST raid_read_error_test 00:11:08.021 ************************************ 00:11:08.021 13:36:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:08.021 13:36:56 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:11:08.021 13:36:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:08.021 13:36:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:08.021 13:36:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:08.021 ************************************ 00:11:08.021 START TEST raid_write_error_test 00:11:08.021 ************************************ 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:08.021 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:08.281 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:08.281 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.og22n7Lqpk 00:11:08.281 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=431865 00:11:08.281 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 431865 /var/tmp/spdk-raid.sock 00:11:08.281 13:36:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:08.281 13:36:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 431865 ']' 00:11:08.281 13:36:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:08.281 13:36:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:08.281 13:36:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:08.281 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:08.281 13:36:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:08.281 13:36:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.281 [2024-07-12 13:36:56.670215] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
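When a run like this completes, the test decides pass/fail from the bdevperf log rather than from RPC output: it pulls the failures-per-second column for the raid_bdev1 job and requires it to be non-zero, mirroring the check the read test performed above. A sketch against this test's log file (path from the mktemp call above):

    # Sketch of the pass/fail check; 0.00 failures/s would mean the injected errors never surfaced.
    fail_per_s=$(grep -v Job /raidtest/tmp.og22n7Lqpk | grep raid_bdev1 | awk '{print $6}')
    [[ $fail_per_s != \0\.\0\0 ]]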
00:11:08.281 [2024-07-12 13:36:56.670284] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid431865 ] 00:11:08.281 [2024-07-12 13:36:56.799496] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:08.540 [2024-07-12 13:36:56.906409] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:08.540 [2024-07-12 13:36:56.970022] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:08.540 [2024-07-12 13:36:56.970057] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:09.109 13:36:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:09.109 13:36:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:09.109 13:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:09.109 13:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:09.368 BaseBdev1_malloc 00:11:09.368 13:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:09.628 true 00:11:09.628 13:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:09.887 [2024-07-12 13:36:58.315973] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:09.887 [2024-07-12 13:36:58.316018] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:09.887 [2024-07-12 13:36:58.316041] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x137ba10 00:11:09.887 [2024-07-12 13:36:58.316054] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:09.887 [2024-07-12 13:36:58.317970] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:09.887 [2024-07-12 13:36:58.317999] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:09.887 BaseBdev1 00:11:09.887 13:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:09.887 13:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:10.146 BaseBdev2_malloc 00:11:10.146 13:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:10.406 true 00:11:10.406 13:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:10.665 [2024-07-12 13:36:59.063794] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:10.665 [2024-07-12 13:36:59.063838] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:10.665 [2024-07-12 13:36:59.063860] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1380250 00:11:10.665 [2024-07-12 13:36:59.063872] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:10.665 [2024-07-12 13:36:59.065441] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:10.665 [2024-07-12 13:36:59.065468] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:10.665 BaseBdev2 00:11:10.665 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:10.924 [2024-07-12 13:36:59.292426] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:10.925 [2024-07-12 13:36:59.293749] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:10.925 [2024-07-12 13:36:59.293938] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1381c60 00:11:10.925 [2024-07-12 13:36:59.293952] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:10.925 [2024-07-12 13:36:59.294144] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1380bb0 00:11:10.925 [2024-07-12 13:36:59.294287] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1381c60 00:11:10.925 [2024-07-12 13:36:59.294297] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1381c60 00:11:10.925 [2024-07-12 13:36:59.294401] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.925 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:11.184 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:11.184 "name": "raid_bdev1", 00:11:11.184 "uuid": "557d0dfb-5576-47b5-9699-5c331f615814", 00:11:11.184 "strip_size_kb": 64, 00:11:11.184 "state": "online", 00:11:11.184 
"raid_level": "raid0", 00:11:11.184 "superblock": true, 00:11:11.184 "num_base_bdevs": 2, 00:11:11.184 "num_base_bdevs_discovered": 2, 00:11:11.184 "num_base_bdevs_operational": 2, 00:11:11.184 "base_bdevs_list": [ 00:11:11.184 { 00:11:11.184 "name": "BaseBdev1", 00:11:11.184 "uuid": "536e52e7-43ed-5a38-9239-cbcdb0f4fde0", 00:11:11.184 "is_configured": true, 00:11:11.184 "data_offset": 2048, 00:11:11.184 "data_size": 63488 00:11:11.184 }, 00:11:11.184 { 00:11:11.184 "name": "BaseBdev2", 00:11:11.184 "uuid": "fa394264-e801-5b49-a93c-511efa0d293c", 00:11:11.184 "is_configured": true, 00:11:11.184 "data_offset": 2048, 00:11:11.184 "data_size": 63488 00:11:11.184 } 00:11:11.184 ] 00:11:11.184 }' 00:11:11.184 13:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:11.184 13:36:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.752 13:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:11.752 13:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:11.752 [2024-07-12 13:37:00.263301] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x137d2f0 00:11:12.690 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:12.949 "name": "raid_bdev1", 00:11:12.949 "uuid": "557d0dfb-5576-47b5-9699-5c331f615814", 00:11:12.949 "strip_size_kb": 64, 
00:11:12.949 "state": "online", 00:11:12.949 "raid_level": "raid0", 00:11:12.949 "superblock": true, 00:11:12.949 "num_base_bdevs": 2, 00:11:12.949 "num_base_bdevs_discovered": 2, 00:11:12.949 "num_base_bdevs_operational": 2, 00:11:12.949 "base_bdevs_list": [ 00:11:12.949 { 00:11:12.949 "name": "BaseBdev1", 00:11:12.949 "uuid": "536e52e7-43ed-5a38-9239-cbcdb0f4fde0", 00:11:12.949 "is_configured": true, 00:11:12.949 "data_offset": 2048, 00:11:12.949 "data_size": 63488 00:11:12.949 }, 00:11:12.949 { 00:11:12.949 "name": "BaseBdev2", 00:11:12.949 "uuid": "fa394264-e801-5b49-a93c-511efa0d293c", 00:11:12.949 "is_configured": true, 00:11:12.949 "data_offset": 2048, 00:11:12.949 "data_size": 63488 00:11:12.949 } 00:11:12.949 ] 00:11:12.949 }' 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:12.949 13:37:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:13.888 [2024-07-12 13:37:02.273764] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:13.888 [2024-07-12 13:37:02.273810] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:13.888 [2024-07-12 13:37:02.277034] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:13.888 [2024-07-12 13:37:02.277065] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:13.888 [2024-07-12 13:37:02.277093] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:13.888 [2024-07-12 13:37:02.277104] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1381c60 name raid_bdev1, state offline 00:11:13.888 0 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 431865 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 431865 ']' 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 431865 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 431865 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 431865' 00:11:13.888 killing process with pid 431865 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 431865 00:11:13.888 [2024-07-12 13:37:02.355548] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:13.888 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 431865 00:11:13.888 [2024-07-12 13:37:02.365948] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:14.148 13:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.og22n7Lqpk 
00:11:14.148 13:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:14.148 13:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:14.148 13:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:11:14.148 13:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:14.148 13:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:14.148 13:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:14.148 13:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:11:14.148 00:11:14.148 real 0m6.006s 00:11:14.148 user 0m9.344s 00:11:14.148 sys 0m1.053s 00:11:14.148 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:14.148 13:37:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:14.148 ************************************ 00:11:14.148 END TEST raid_write_error_test 00:11:14.148 ************************************ 00:11:14.148 13:37:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:14.148 13:37:02 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:14.148 13:37:02 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:11:14.148 13:37:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:14.148 13:37:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:14.148 13:37:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:14.148 ************************************ 00:11:14.148 START TEST raid_state_function_test 00:11:14.148 ************************************ 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=432674 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 432674' 00:11:14.148 Process raid pid: 432674 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 432674 /var/tmp/spdk-raid.sock 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 432674 ']' 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:14.148 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:14.148 13:37:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:14.408 [2024-07-12 13:37:02.760533] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
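raid_state_function_test drives a bare bdev_svc app rather than bdevperf and exercises the raid state machine directly: creating the concat array before its base bdevs exist leaves it in the "configuring" state, adding the malloc bdevs brings it "online", and removing a base bdev drops it to "offline" because concat carries no redundancy. Condensed from the RPC calls in the trace that follows (the test also tears the array down and re-creates it between steps; paths shortened):

  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat \
      -b 'BaseBdev1 BaseBdev2' -n Existed_Raid          # base bdevs missing -> state "configuring"
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2   # both present -> "online"
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1             # concat cannot degrade -> "offline"
  # state assertions read the array back and filter it with jq
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'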
00:11:14.408 [2024-07-12 13:37:02.760601] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:14.408 [2024-07-12 13:37:02.894226] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:14.666 [2024-07-12 13:37:03.007640] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:14.666 [2024-07-12 13:37:03.071550] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:14.666 [2024-07-12 13:37:03.071580] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:14.666 13:37:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:14.666 13:37:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:14.666 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:14.925 [2024-07-12 13:37:03.384731] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:14.925 [2024-07-12 13:37:03.384774] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:14.925 [2024-07-12 13:37:03.384784] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:14.925 [2024-07-12 13:37:03.384796] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.925 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:15.185 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:15.185 "name": "Existed_Raid", 00:11:15.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.185 "strip_size_kb": 64, 00:11:15.185 "state": "configuring", 00:11:15.185 "raid_level": "concat", 00:11:15.185 "superblock": false, 
00:11:15.185 "num_base_bdevs": 2, 00:11:15.185 "num_base_bdevs_discovered": 0, 00:11:15.185 "num_base_bdevs_operational": 2, 00:11:15.185 "base_bdevs_list": [ 00:11:15.185 { 00:11:15.185 "name": "BaseBdev1", 00:11:15.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.185 "is_configured": false, 00:11:15.185 "data_offset": 0, 00:11:15.185 "data_size": 0 00:11:15.185 }, 00:11:15.185 { 00:11:15.185 "name": "BaseBdev2", 00:11:15.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.185 "is_configured": false, 00:11:15.185 "data_offset": 0, 00:11:15.185 "data_size": 0 00:11:15.185 } 00:11:15.185 ] 00:11:15.185 }' 00:11:15.185 13:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:15.185 13:37:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:15.753 13:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:16.013 [2024-07-12 13:37:04.343148] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:16.013 [2024-07-12 13:37:04.343179] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x217d330 name Existed_Raid, state configuring 00:11:16.013 13:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:16.013 [2024-07-12 13:37:04.591813] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:16.013 [2024-07-12 13:37:04.591843] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:16.013 [2024-07-12 13:37:04.591853] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:16.013 [2024-07-12 13:37:04.591864] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:16.272 13:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:16.272 [2024-07-12 13:37:04.778246] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:16.272 BaseBdev1 00:11:16.272 13:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:16.272 13:37:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:16.272 13:37:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:16.272 13:37:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:16.272 13:37:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:16.272 13:37:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:16.272 13:37:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:16.530 13:37:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 
00:11:16.795 [ 00:11:16.795 { 00:11:16.795 "name": "BaseBdev1", 00:11:16.795 "aliases": [ 00:11:16.795 "ac40d358-5610-4879-ae15-88edd06d1aea" 00:11:16.795 ], 00:11:16.795 "product_name": "Malloc disk", 00:11:16.795 "block_size": 512, 00:11:16.795 "num_blocks": 65536, 00:11:16.795 "uuid": "ac40d358-5610-4879-ae15-88edd06d1aea", 00:11:16.795 "assigned_rate_limits": { 00:11:16.795 "rw_ios_per_sec": 0, 00:11:16.795 "rw_mbytes_per_sec": 0, 00:11:16.795 "r_mbytes_per_sec": 0, 00:11:16.795 "w_mbytes_per_sec": 0 00:11:16.795 }, 00:11:16.795 "claimed": true, 00:11:16.795 "claim_type": "exclusive_write", 00:11:16.795 "zoned": false, 00:11:16.795 "supported_io_types": { 00:11:16.795 "read": true, 00:11:16.795 "write": true, 00:11:16.795 "unmap": true, 00:11:16.795 "flush": true, 00:11:16.795 "reset": true, 00:11:16.795 "nvme_admin": false, 00:11:16.795 "nvme_io": false, 00:11:16.795 "nvme_io_md": false, 00:11:16.795 "write_zeroes": true, 00:11:16.795 "zcopy": true, 00:11:16.795 "get_zone_info": false, 00:11:16.795 "zone_management": false, 00:11:16.795 "zone_append": false, 00:11:16.795 "compare": false, 00:11:16.795 "compare_and_write": false, 00:11:16.795 "abort": true, 00:11:16.795 "seek_hole": false, 00:11:16.795 "seek_data": false, 00:11:16.795 "copy": true, 00:11:16.795 "nvme_iov_md": false 00:11:16.795 }, 00:11:16.795 "memory_domains": [ 00:11:16.795 { 00:11:16.795 "dma_device_id": "system", 00:11:16.795 "dma_device_type": 1 00:11:16.795 }, 00:11:16.795 { 00:11:16.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:16.795 "dma_device_type": 2 00:11:16.795 } 00:11:16.795 ], 00:11:16.795 "driver_specific": {} 00:11:16.795 } 00:11:16.795 ] 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:16.795 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:16.795 "name": "Existed_Raid", 00:11:16.795 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:16.795 "strip_size_kb": 64, 00:11:16.795 "state": "configuring", 00:11:16.795 
"raid_level": "concat", 00:11:16.795 "superblock": false, 00:11:16.795 "num_base_bdevs": 2, 00:11:16.795 "num_base_bdevs_discovered": 1, 00:11:16.795 "num_base_bdevs_operational": 2, 00:11:16.795 "base_bdevs_list": [ 00:11:16.795 { 00:11:16.796 "name": "BaseBdev1", 00:11:16.796 "uuid": "ac40d358-5610-4879-ae15-88edd06d1aea", 00:11:16.796 "is_configured": true, 00:11:16.796 "data_offset": 0, 00:11:16.796 "data_size": 65536 00:11:16.796 }, 00:11:16.796 { 00:11:16.796 "name": "BaseBdev2", 00:11:16.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:16.796 "is_configured": false, 00:11:16.796 "data_offset": 0, 00:11:16.796 "data_size": 0 00:11:16.796 } 00:11:16.796 ] 00:11:16.796 }' 00:11:16.796 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:16.796 13:37:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:17.364 13:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:17.622 [2024-07-12 13:37:06.093747] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:17.622 [2024-07-12 13:37:06.093786] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x217cc20 name Existed_Raid, state configuring 00:11:17.622 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:17.880 [2024-07-12 13:37:06.266234] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:17.880 [2024-07-12 13:37:06.267713] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:17.880 [2024-07-12 13:37:06.267747] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.880 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:18.138 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:18.138 "name": "Existed_Raid", 00:11:18.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.138 "strip_size_kb": 64, 00:11:18.138 "state": "configuring", 00:11:18.138 "raid_level": "concat", 00:11:18.138 "superblock": false, 00:11:18.138 "num_base_bdevs": 2, 00:11:18.138 "num_base_bdevs_discovered": 1, 00:11:18.138 "num_base_bdevs_operational": 2, 00:11:18.138 "base_bdevs_list": [ 00:11:18.138 { 00:11:18.138 "name": "BaseBdev1", 00:11:18.138 "uuid": "ac40d358-5610-4879-ae15-88edd06d1aea", 00:11:18.138 "is_configured": true, 00:11:18.138 "data_offset": 0, 00:11:18.138 "data_size": 65536 00:11:18.138 }, 00:11:18.138 { 00:11:18.138 "name": "BaseBdev2", 00:11:18.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.138 "is_configured": false, 00:11:18.138 "data_offset": 0, 00:11:18.138 "data_size": 0 00:11:18.138 } 00:11:18.138 ] 00:11:18.138 }' 00:11:18.138 13:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:18.138 13:37:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:18.705 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:18.964 [2024-07-12 13:37:07.321592] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:18.964 [2024-07-12 13:37:07.321632] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x217da10 00:11:18.964 [2024-07-12 13:37:07.321641] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:18.964 [2024-07-12 13:37:07.321836] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23213b0 00:11:18.964 [2024-07-12 13:37:07.321965] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x217da10 00:11:18.964 [2024-07-12 13:37:07.321976] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x217da10 00:11:18.964 [2024-07-12 13:37:07.322152] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:18.964 BaseBdev2 00:11:18.964 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:18.964 13:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:18.964 13:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:18.964 13:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:18.964 13:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:18.964 13:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:18.964 13:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:19.222 13:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:19.481 [ 00:11:19.481 { 00:11:19.481 "name": "BaseBdev2", 00:11:19.481 "aliases": [ 00:11:19.481 "df27922a-89fa-4db0-b07b-0072309fc8cb" 00:11:19.481 ], 00:11:19.481 "product_name": "Malloc disk", 00:11:19.481 "block_size": 512, 00:11:19.481 "num_blocks": 65536, 00:11:19.481 "uuid": "df27922a-89fa-4db0-b07b-0072309fc8cb", 00:11:19.481 "assigned_rate_limits": { 00:11:19.481 "rw_ios_per_sec": 0, 00:11:19.481 "rw_mbytes_per_sec": 0, 00:11:19.481 "r_mbytes_per_sec": 0, 00:11:19.481 "w_mbytes_per_sec": 0 00:11:19.481 }, 00:11:19.481 "claimed": true, 00:11:19.481 "claim_type": "exclusive_write", 00:11:19.481 "zoned": false, 00:11:19.481 "supported_io_types": { 00:11:19.481 "read": true, 00:11:19.481 "write": true, 00:11:19.481 "unmap": true, 00:11:19.481 "flush": true, 00:11:19.481 "reset": true, 00:11:19.481 "nvme_admin": false, 00:11:19.481 "nvme_io": false, 00:11:19.481 "nvme_io_md": false, 00:11:19.481 "write_zeroes": true, 00:11:19.481 "zcopy": true, 00:11:19.481 "get_zone_info": false, 00:11:19.481 "zone_management": false, 00:11:19.481 "zone_append": false, 00:11:19.481 "compare": false, 00:11:19.481 "compare_and_write": false, 00:11:19.481 "abort": true, 00:11:19.481 "seek_hole": false, 00:11:19.481 "seek_data": false, 00:11:19.481 "copy": true, 00:11:19.481 "nvme_iov_md": false 00:11:19.481 }, 00:11:19.481 "memory_domains": [ 00:11:19.481 { 00:11:19.481 "dma_device_id": "system", 00:11:19.481 "dma_device_type": 1 00:11:19.481 }, 00:11:19.481 { 00:11:19.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:19.481 "dma_device_type": 2 00:11:19.481 } 00:11:19.481 ], 00:11:19.481 "driver_specific": {} 00:11:19.481 } 00:11:19.481 ] 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:19.481 13:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:19.740 13:37:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:19.740 "name": "Existed_Raid", 00:11:19.740 "uuid": "b58da3ba-57fd-484b-92df-82bdbc1b0b23", 00:11:19.740 "strip_size_kb": 64, 00:11:19.740 "state": "online", 00:11:19.740 "raid_level": "concat", 00:11:19.740 "superblock": false, 00:11:19.740 "num_base_bdevs": 2, 00:11:19.740 "num_base_bdevs_discovered": 2, 00:11:19.740 "num_base_bdevs_operational": 2, 00:11:19.740 "base_bdevs_list": [ 00:11:19.740 { 00:11:19.740 "name": "BaseBdev1", 00:11:19.740 "uuid": "ac40d358-5610-4879-ae15-88edd06d1aea", 00:11:19.740 "is_configured": true, 00:11:19.740 "data_offset": 0, 00:11:19.740 "data_size": 65536 00:11:19.740 }, 00:11:19.740 { 00:11:19.740 "name": "BaseBdev2", 00:11:19.740 "uuid": "df27922a-89fa-4db0-b07b-0072309fc8cb", 00:11:19.740 "is_configured": true, 00:11:19.740 "data_offset": 0, 00:11:19.740 "data_size": 65536 00:11:19.740 } 00:11:19.740 ] 00:11:19.740 }' 00:11:19.740 13:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:19.740 13:37:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:20.309 13:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:20.309 13:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:20.309 13:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:20.309 13:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:20.309 13:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:20.309 13:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:20.309 13:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:20.309 13:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:20.569 [2024-07-12 13:37:08.930133] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:20.569 13:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:20.569 "name": "Existed_Raid", 00:11:20.569 "aliases": [ 00:11:20.569 "b58da3ba-57fd-484b-92df-82bdbc1b0b23" 00:11:20.569 ], 00:11:20.569 "product_name": "Raid Volume", 00:11:20.569 "block_size": 512, 00:11:20.569 "num_blocks": 131072, 00:11:20.569 "uuid": "b58da3ba-57fd-484b-92df-82bdbc1b0b23", 00:11:20.569 "assigned_rate_limits": { 00:11:20.569 "rw_ios_per_sec": 0, 00:11:20.569 "rw_mbytes_per_sec": 0, 00:11:20.569 "r_mbytes_per_sec": 0, 00:11:20.569 "w_mbytes_per_sec": 0 00:11:20.569 }, 00:11:20.569 "claimed": false, 00:11:20.569 "zoned": false, 00:11:20.569 "supported_io_types": { 00:11:20.569 "read": true, 00:11:20.569 "write": true, 00:11:20.569 "unmap": true, 00:11:20.569 "flush": true, 00:11:20.569 "reset": true, 00:11:20.569 "nvme_admin": false, 00:11:20.569 "nvme_io": false, 00:11:20.569 "nvme_io_md": false, 00:11:20.569 "write_zeroes": true, 00:11:20.569 "zcopy": false, 00:11:20.569 "get_zone_info": false, 00:11:20.569 "zone_management": false, 00:11:20.569 "zone_append": false, 00:11:20.569 "compare": false, 00:11:20.569 "compare_and_write": false, 00:11:20.569 "abort": false, 00:11:20.569 "seek_hole": false, 00:11:20.569 "seek_data": false, 00:11:20.569 "copy": false, 00:11:20.569 
"nvme_iov_md": false 00:11:20.569 }, 00:11:20.569 "memory_domains": [ 00:11:20.569 { 00:11:20.569 "dma_device_id": "system", 00:11:20.569 "dma_device_type": 1 00:11:20.569 }, 00:11:20.569 { 00:11:20.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.569 "dma_device_type": 2 00:11:20.569 }, 00:11:20.569 { 00:11:20.569 "dma_device_id": "system", 00:11:20.569 "dma_device_type": 1 00:11:20.569 }, 00:11:20.569 { 00:11:20.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.569 "dma_device_type": 2 00:11:20.569 } 00:11:20.569 ], 00:11:20.569 "driver_specific": { 00:11:20.569 "raid": { 00:11:20.569 "uuid": "b58da3ba-57fd-484b-92df-82bdbc1b0b23", 00:11:20.569 "strip_size_kb": 64, 00:11:20.569 "state": "online", 00:11:20.569 "raid_level": "concat", 00:11:20.569 "superblock": false, 00:11:20.569 "num_base_bdevs": 2, 00:11:20.569 "num_base_bdevs_discovered": 2, 00:11:20.569 "num_base_bdevs_operational": 2, 00:11:20.569 "base_bdevs_list": [ 00:11:20.569 { 00:11:20.569 "name": "BaseBdev1", 00:11:20.569 "uuid": "ac40d358-5610-4879-ae15-88edd06d1aea", 00:11:20.569 "is_configured": true, 00:11:20.569 "data_offset": 0, 00:11:20.569 "data_size": 65536 00:11:20.569 }, 00:11:20.569 { 00:11:20.569 "name": "BaseBdev2", 00:11:20.569 "uuid": "df27922a-89fa-4db0-b07b-0072309fc8cb", 00:11:20.569 "is_configured": true, 00:11:20.569 "data_offset": 0, 00:11:20.569 "data_size": 65536 00:11:20.569 } 00:11:20.569 ] 00:11:20.569 } 00:11:20.569 } 00:11:20.569 }' 00:11:20.569 13:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:20.569 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:20.569 BaseBdev2' 00:11:20.569 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:20.569 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:20.569 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:20.829 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:20.829 "name": "BaseBdev1", 00:11:20.829 "aliases": [ 00:11:20.829 "ac40d358-5610-4879-ae15-88edd06d1aea" 00:11:20.829 ], 00:11:20.829 "product_name": "Malloc disk", 00:11:20.829 "block_size": 512, 00:11:20.829 "num_blocks": 65536, 00:11:20.829 "uuid": "ac40d358-5610-4879-ae15-88edd06d1aea", 00:11:20.829 "assigned_rate_limits": { 00:11:20.829 "rw_ios_per_sec": 0, 00:11:20.829 "rw_mbytes_per_sec": 0, 00:11:20.829 "r_mbytes_per_sec": 0, 00:11:20.829 "w_mbytes_per_sec": 0 00:11:20.829 }, 00:11:20.829 "claimed": true, 00:11:20.829 "claim_type": "exclusive_write", 00:11:20.829 "zoned": false, 00:11:20.829 "supported_io_types": { 00:11:20.829 "read": true, 00:11:20.829 "write": true, 00:11:20.829 "unmap": true, 00:11:20.829 "flush": true, 00:11:20.829 "reset": true, 00:11:20.829 "nvme_admin": false, 00:11:20.829 "nvme_io": false, 00:11:20.829 "nvme_io_md": false, 00:11:20.829 "write_zeroes": true, 00:11:20.829 "zcopy": true, 00:11:20.829 "get_zone_info": false, 00:11:20.829 "zone_management": false, 00:11:20.829 "zone_append": false, 00:11:20.829 "compare": false, 00:11:20.829 "compare_and_write": false, 00:11:20.829 "abort": true, 00:11:20.829 "seek_hole": false, 00:11:20.829 "seek_data": false, 00:11:20.829 "copy": true, 00:11:20.829 
"nvme_iov_md": false 00:11:20.829 }, 00:11:20.829 "memory_domains": [ 00:11:20.829 { 00:11:20.829 "dma_device_id": "system", 00:11:20.829 "dma_device_type": 1 00:11:20.829 }, 00:11:20.829 { 00:11:20.829 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.829 "dma_device_type": 2 00:11:20.829 } 00:11:20.829 ], 00:11:20.829 "driver_specific": {} 00:11:20.829 }' 00:11:20.829 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:20.829 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:20.829 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:20.829 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:20.829 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.088 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:21.088 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.088 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.088 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:21.088 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.088 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.088 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:21.088 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:21.088 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:21.088 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:21.348 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:21.348 "name": "BaseBdev2", 00:11:21.348 "aliases": [ 00:11:21.348 "df27922a-89fa-4db0-b07b-0072309fc8cb" 00:11:21.348 ], 00:11:21.348 "product_name": "Malloc disk", 00:11:21.348 "block_size": 512, 00:11:21.348 "num_blocks": 65536, 00:11:21.348 "uuid": "df27922a-89fa-4db0-b07b-0072309fc8cb", 00:11:21.348 "assigned_rate_limits": { 00:11:21.348 "rw_ios_per_sec": 0, 00:11:21.348 "rw_mbytes_per_sec": 0, 00:11:21.348 "r_mbytes_per_sec": 0, 00:11:21.348 "w_mbytes_per_sec": 0 00:11:21.348 }, 00:11:21.348 "claimed": true, 00:11:21.348 "claim_type": "exclusive_write", 00:11:21.348 "zoned": false, 00:11:21.348 "supported_io_types": { 00:11:21.348 "read": true, 00:11:21.348 "write": true, 00:11:21.348 "unmap": true, 00:11:21.348 "flush": true, 00:11:21.348 "reset": true, 00:11:21.348 "nvme_admin": false, 00:11:21.348 "nvme_io": false, 00:11:21.348 "nvme_io_md": false, 00:11:21.348 "write_zeroes": true, 00:11:21.348 "zcopy": true, 00:11:21.348 "get_zone_info": false, 00:11:21.348 "zone_management": false, 00:11:21.348 "zone_append": false, 00:11:21.348 "compare": false, 00:11:21.348 "compare_and_write": false, 00:11:21.348 "abort": true, 00:11:21.348 "seek_hole": false, 00:11:21.348 "seek_data": false, 00:11:21.348 "copy": true, 00:11:21.348 "nvme_iov_md": false 00:11:21.348 }, 00:11:21.348 "memory_domains": [ 00:11:21.348 { 00:11:21.348 "dma_device_id": "system", 00:11:21.348 "dma_device_type": 1 00:11:21.348 }, 
00:11:21.348 { 00:11:21.348 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.348 "dma_device_type": 2 00:11:21.348 } 00:11:21.348 ], 00:11:21.348 "driver_specific": {} 00:11:21.348 }' 00:11:21.348 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.608 13:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.608 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:21.608 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.608 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.608 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:21.608 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.608 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.867 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:21.867 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.867 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.867 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:21.867 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:22.127 [2024-07-12 13:37:10.518109] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:22.127 [2024-07-12 13:37:10.518138] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:22.127 [2024-07-12 13:37:10.518180] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.127 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:22.387 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.387 "name": "Existed_Raid", 00:11:22.387 "uuid": "b58da3ba-57fd-484b-92df-82bdbc1b0b23", 00:11:22.387 "strip_size_kb": 64, 00:11:22.387 "state": "offline", 00:11:22.387 "raid_level": "concat", 00:11:22.387 "superblock": false, 00:11:22.387 "num_base_bdevs": 2, 00:11:22.387 "num_base_bdevs_discovered": 1, 00:11:22.387 "num_base_bdevs_operational": 1, 00:11:22.387 "base_bdevs_list": [ 00:11:22.387 { 00:11:22.387 "name": null, 00:11:22.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:22.387 "is_configured": false, 00:11:22.387 "data_offset": 0, 00:11:22.387 "data_size": 65536 00:11:22.387 }, 00:11:22.387 { 00:11:22.387 "name": "BaseBdev2", 00:11:22.387 "uuid": "df27922a-89fa-4db0-b07b-0072309fc8cb", 00:11:22.387 "is_configured": true, 00:11:22.387 "data_offset": 0, 00:11:22.387 "data_size": 65536 00:11:22.387 } 00:11:22.387 ] 00:11:22.387 }' 00:11:22.387 13:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.387 13:37:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:22.953 13:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:22.953 13:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:22.953 13:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.953 13:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:23.213 13:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:23.213 13:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:23.213 13:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:23.472 [2024-07-12 13:37:11.902850] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:23.472 [2024-07-12 13:37:11.902909] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x217da10 name Existed_Raid, state offline 00:11:23.472 13:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:23.472 13:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:23.472 13:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.472 13:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 432674 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 432674 ']' 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 432674 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 432674 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 432674' 00:11:23.731 killing process with pid 432674 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 432674 00:11:23.731 [2024-07-12 13:37:12.220228] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:23.731 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 432674 00:11:23.731 [2024-07-12 13:37:12.221225] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:23.991 13:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:23.992 00:11:23.992 real 0m9.760s 00:11:23.992 user 0m17.704s 00:11:23.992 sys 0m1.926s 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.992 ************************************ 00:11:23.992 END TEST raid_state_function_test 00:11:23.992 ************************************ 00:11:23.992 13:37:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:23.992 13:37:12 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:11:23.992 13:37:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:23.992 13:37:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:23.992 13:37:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:23.992 ************************************ 00:11:23.992 START TEST raid_state_function_test_sb 00:11:23.992 ************************************ 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:23.992 13:37:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=434300 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 434300' 00:11:23.992 Process raid pid: 434300 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 434300 /var/tmp/spdk-raid.sock 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 434300 ']' 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:23.992 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:23.992 13:37:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:24.252 [2024-07-12 13:37:12.602981] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:11:24.252 [2024-07-12 13:37:12.603037] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:24.252 [2024-07-12 13:37:12.718267] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:24.252 [2024-07-12 13:37:12.823418] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:24.511 [2024-07-12 13:37:12.888070] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:24.511 [2024-07-12 13:37:12.888116] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:25.079 13:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:25.079 13:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:25.079 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:25.337 [2024-07-12 13:37:13.702600] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:25.337 [2024-07-12 13:37:13.702644] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:25.337 [2024-07-12 13:37:13.702655] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:25.337 [2024-07-12 13:37:13.702667] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:25.337 "name": "Existed_Raid", 00:11:25.337 "uuid": "9a274739-fbe8-4f49-868d-9eebd3d8f8a3", 00:11:25.337 "strip_size_kb": 64, 00:11:25.337 "state": "configuring", 00:11:25.337 "raid_level": "concat", 00:11:25.337 "superblock": true, 00:11:25.337 "num_base_bdevs": 2, 00:11:25.337 "num_base_bdevs_discovered": 0, 00:11:25.337 "num_base_bdevs_operational": 2, 00:11:25.337 "base_bdevs_list": [ 00:11:25.337 { 00:11:25.337 "name": "BaseBdev1", 00:11:25.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.337 "is_configured": false, 00:11:25.337 "data_offset": 0, 00:11:25.337 "data_size": 0 00:11:25.337 }, 00:11:25.337 { 00:11:25.337 "name": "BaseBdev2", 00:11:25.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.337 "is_configured": false, 00:11:25.337 "data_offset": 0, 00:11:25.337 "data_size": 0 00:11:25.337 } 00:11:25.337 ] 00:11:25.337 }' 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:25.337 13:37:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:26.273 13:37:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:26.532 [2024-07-12 13:37:15.029960] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:26.532 [2024-07-12 13:37:15.030000] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20d9330 name Existed_Raid, state configuring 00:11:26.532 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:26.792 [2024-07-12 13:37:15.206431] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:26.792 [2024-07-12 13:37:15.206463] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:26.792 [2024-07-12 13:37:15.206473] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:26.792 [2024-07-12 13:37:15.206485] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:26.792 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:27.052 [2024-07-12 13:37:15.466274] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:27.052 BaseBdev1 00:11:27.052 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:27.052 13:37:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:27.052 13:37:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:27.052 13:37:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:27.052 13:37:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:27.052 13:37:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:27.052 
13:37:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:27.311 13:37:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:27.570 [ 00:11:27.570 { 00:11:27.570 "name": "BaseBdev1", 00:11:27.570 "aliases": [ 00:11:27.570 "4f213aed-cf16-4b56-893a-4392f99f631a" 00:11:27.570 ], 00:11:27.570 "product_name": "Malloc disk", 00:11:27.570 "block_size": 512, 00:11:27.570 "num_blocks": 65536, 00:11:27.570 "uuid": "4f213aed-cf16-4b56-893a-4392f99f631a", 00:11:27.570 "assigned_rate_limits": { 00:11:27.570 "rw_ios_per_sec": 0, 00:11:27.570 "rw_mbytes_per_sec": 0, 00:11:27.570 "r_mbytes_per_sec": 0, 00:11:27.570 "w_mbytes_per_sec": 0 00:11:27.570 }, 00:11:27.570 "claimed": true, 00:11:27.570 "claim_type": "exclusive_write", 00:11:27.570 "zoned": false, 00:11:27.570 "supported_io_types": { 00:11:27.570 "read": true, 00:11:27.570 "write": true, 00:11:27.570 "unmap": true, 00:11:27.570 "flush": true, 00:11:27.570 "reset": true, 00:11:27.570 "nvme_admin": false, 00:11:27.570 "nvme_io": false, 00:11:27.570 "nvme_io_md": false, 00:11:27.570 "write_zeroes": true, 00:11:27.570 "zcopy": true, 00:11:27.570 "get_zone_info": false, 00:11:27.570 "zone_management": false, 00:11:27.570 "zone_append": false, 00:11:27.570 "compare": false, 00:11:27.570 "compare_and_write": false, 00:11:27.570 "abort": true, 00:11:27.570 "seek_hole": false, 00:11:27.570 "seek_data": false, 00:11:27.570 "copy": true, 00:11:27.570 "nvme_iov_md": false 00:11:27.570 }, 00:11:27.570 "memory_domains": [ 00:11:27.570 { 00:11:27.570 "dma_device_id": "system", 00:11:27.570 "dma_device_type": 1 00:11:27.570 }, 00:11:27.570 { 00:11:27.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.570 "dma_device_type": 2 00:11:27.570 } 00:11:27.570 ], 00:11:27.570 "driver_specific": {} 00:11:27.570 } 00:11:27.570 ] 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:27.570 13:37:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:27.830 13:37:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:27.830 "name": "Existed_Raid", 00:11:27.830 "uuid": "3aff3925-d489-4ac5-a282-78354c44351f", 00:11:27.830 "strip_size_kb": 64, 00:11:27.830 "state": "configuring", 00:11:27.830 "raid_level": "concat", 00:11:27.830 "superblock": true, 00:11:27.830 "num_base_bdevs": 2, 00:11:27.830 "num_base_bdevs_discovered": 1, 00:11:27.830 "num_base_bdevs_operational": 2, 00:11:27.830 "base_bdevs_list": [ 00:11:27.830 { 00:11:27.830 "name": "BaseBdev1", 00:11:27.830 "uuid": "4f213aed-cf16-4b56-893a-4392f99f631a", 00:11:27.830 "is_configured": true, 00:11:27.830 "data_offset": 2048, 00:11:27.830 "data_size": 63488 00:11:27.830 }, 00:11:27.830 { 00:11:27.830 "name": "BaseBdev2", 00:11:27.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:27.830 "is_configured": false, 00:11:27.830 "data_offset": 0, 00:11:27.830 "data_size": 0 00:11:27.830 } 00:11:27.830 ] 00:11:27.830 }' 00:11:27.830 13:37:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:27.830 13:37:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:28.399 13:37:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:28.657 [2024-07-12 13:37:17.030425] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:28.657 [2024-07-12 13:37:17.030469] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20d8c20 name Existed_Raid, state configuring 00:11:28.657 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:28.916 [2024-07-12 13:37:17.279120] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:28.916 [2024-07-12 13:37:17.280611] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:28.916 [2024-07-12 13:37:17.280644] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:28.916 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:28.916 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:28.916 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:28.916 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:28.916 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:28.916 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:28.916 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:28.916 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:28.916 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- 
# local raid_bdev_info 00:11:28.916 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:28.916 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:28.917 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:28.917 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:28.917 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.175 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:29.175 "name": "Existed_Raid", 00:11:29.175 "uuid": "302dcdac-a85a-4b36-b00a-abbfdc128774", 00:11:29.175 "strip_size_kb": 64, 00:11:29.175 "state": "configuring", 00:11:29.175 "raid_level": "concat", 00:11:29.175 "superblock": true, 00:11:29.175 "num_base_bdevs": 2, 00:11:29.175 "num_base_bdevs_discovered": 1, 00:11:29.175 "num_base_bdevs_operational": 2, 00:11:29.175 "base_bdevs_list": [ 00:11:29.175 { 00:11:29.175 "name": "BaseBdev1", 00:11:29.175 "uuid": "4f213aed-cf16-4b56-893a-4392f99f631a", 00:11:29.175 "is_configured": true, 00:11:29.175 "data_offset": 2048, 00:11:29.175 "data_size": 63488 00:11:29.175 }, 00:11:29.175 { 00:11:29.175 "name": "BaseBdev2", 00:11:29.175 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:29.175 "is_configured": false, 00:11:29.175 "data_offset": 0, 00:11:29.175 "data_size": 0 00:11:29.175 } 00:11:29.175 ] 00:11:29.175 }' 00:11:29.175 13:37:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:29.175 13:37:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:29.740 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:29.999 [2024-07-12 13:37:18.393425] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:29.999 [2024-07-12 13:37:18.393577] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20d9a10 00:11:29.999 [2024-07-12 13:37:18.393591] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:29.999 [2024-07-12 13:37:18.393767] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20d8b70 00:11:29.999 [2024-07-12 13:37:18.393882] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20d9a10 00:11:29.999 [2024-07-12 13:37:18.393892] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20d9a10 00:11:29.999 [2024-07-12 13:37:18.394000] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:29.999 BaseBdev2 00:11:29.999 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:29.999 13:37:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:29.999 13:37:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:29.999 13:37:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:29.999 13:37:18 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:29.999 13:37:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:29.999 13:37:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:30.257 13:37:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:30.514 [ 00:11:30.514 { 00:11:30.514 "name": "BaseBdev2", 00:11:30.514 "aliases": [ 00:11:30.514 "d461989a-180a-478a-aefa-7d5b72801aac" 00:11:30.514 ], 00:11:30.514 "product_name": "Malloc disk", 00:11:30.514 "block_size": 512, 00:11:30.514 "num_blocks": 65536, 00:11:30.514 "uuid": "d461989a-180a-478a-aefa-7d5b72801aac", 00:11:30.514 "assigned_rate_limits": { 00:11:30.514 "rw_ios_per_sec": 0, 00:11:30.514 "rw_mbytes_per_sec": 0, 00:11:30.514 "r_mbytes_per_sec": 0, 00:11:30.514 "w_mbytes_per_sec": 0 00:11:30.514 }, 00:11:30.514 "claimed": true, 00:11:30.514 "claim_type": "exclusive_write", 00:11:30.514 "zoned": false, 00:11:30.514 "supported_io_types": { 00:11:30.514 "read": true, 00:11:30.514 "write": true, 00:11:30.514 "unmap": true, 00:11:30.514 "flush": true, 00:11:30.514 "reset": true, 00:11:30.514 "nvme_admin": false, 00:11:30.514 "nvme_io": false, 00:11:30.514 "nvme_io_md": false, 00:11:30.514 "write_zeroes": true, 00:11:30.514 "zcopy": true, 00:11:30.514 "get_zone_info": false, 00:11:30.514 "zone_management": false, 00:11:30.514 "zone_append": false, 00:11:30.514 "compare": false, 00:11:30.514 "compare_and_write": false, 00:11:30.514 "abort": true, 00:11:30.514 "seek_hole": false, 00:11:30.514 "seek_data": false, 00:11:30.514 "copy": true, 00:11:30.514 "nvme_iov_md": false 00:11:30.514 }, 00:11:30.514 "memory_domains": [ 00:11:30.514 { 00:11:30.514 "dma_device_id": "system", 00:11:30.514 "dma_device_type": 1 00:11:30.514 }, 00:11:30.514 { 00:11:30.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:30.514 "dma_device_type": 2 00:11:30.514 } 00:11:30.514 ], 00:11:30.514 "driver_specific": {} 00:11:30.514 } 00:11:30.514 ] 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:30.514 
13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.514 13:37:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:30.772 13:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:30.772 "name": "Existed_Raid", 00:11:30.772 "uuid": "302dcdac-a85a-4b36-b00a-abbfdc128774", 00:11:30.772 "strip_size_kb": 64, 00:11:30.772 "state": "online", 00:11:30.772 "raid_level": "concat", 00:11:30.772 "superblock": true, 00:11:30.772 "num_base_bdevs": 2, 00:11:30.772 "num_base_bdevs_discovered": 2, 00:11:30.772 "num_base_bdevs_operational": 2, 00:11:30.772 "base_bdevs_list": [ 00:11:30.772 { 00:11:30.772 "name": "BaseBdev1", 00:11:30.772 "uuid": "4f213aed-cf16-4b56-893a-4392f99f631a", 00:11:30.772 "is_configured": true, 00:11:30.772 "data_offset": 2048, 00:11:30.772 "data_size": 63488 00:11:30.772 }, 00:11:30.772 { 00:11:30.772 "name": "BaseBdev2", 00:11:30.772 "uuid": "d461989a-180a-478a-aefa-7d5b72801aac", 00:11:30.772 "is_configured": true, 00:11:30.772 "data_offset": 2048, 00:11:30.772 "data_size": 63488 00:11:30.772 } 00:11:30.772 ] 00:11:30.772 }' 00:11:30.772 13:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:30.772 13:37:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:31.338 13:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:31.338 13:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:31.338 13:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:31.338 13:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:31.338 13:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:31.338 13:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:31.339 13:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:31.339 13:37:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:31.597 [2024-07-12 13:37:19.977916] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:31.597 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:31.597 "name": "Existed_Raid", 00:11:31.597 "aliases": [ 00:11:31.597 "302dcdac-a85a-4b36-b00a-abbfdc128774" 00:11:31.597 ], 00:11:31.597 "product_name": "Raid Volume", 00:11:31.597 "block_size": 512, 00:11:31.597 "num_blocks": 126976, 00:11:31.597 "uuid": "302dcdac-a85a-4b36-b00a-abbfdc128774", 00:11:31.597 "assigned_rate_limits": { 00:11:31.597 "rw_ios_per_sec": 0, 00:11:31.597 "rw_mbytes_per_sec": 0, 00:11:31.597 "r_mbytes_per_sec": 0, 00:11:31.597 "w_mbytes_per_sec": 0 00:11:31.597 }, 00:11:31.597 "claimed": false, 00:11:31.597 "zoned": false, 00:11:31.597 
"supported_io_types": { 00:11:31.597 "read": true, 00:11:31.597 "write": true, 00:11:31.597 "unmap": true, 00:11:31.597 "flush": true, 00:11:31.597 "reset": true, 00:11:31.597 "nvme_admin": false, 00:11:31.597 "nvme_io": false, 00:11:31.597 "nvme_io_md": false, 00:11:31.597 "write_zeroes": true, 00:11:31.597 "zcopy": false, 00:11:31.597 "get_zone_info": false, 00:11:31.597 "zone_management": false, 00:11:31.597 "zone_append": false, 00:11:31.597 "compare": false, 00:11:31.597 "compare_and_write": false, 00:11:31.597 "abort": false, 00:11:31.597 "seek_hole": false, 00:11:31.597 "seek_data": false, 00:11:31.597 "copy": false, 00:11:31.597 "nvme_iov_md": false 00:11:31.597 }, 00:11:31.597 "memory_domains": [ 00:11:31.597 { 00:11:31.597 "dma_device_id": "system", 00:11:31.597 "dma_device_type": 1 00:11:31.597 }, 00:11:31.597 { 00:11:31.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:31.597 "dma_device_type": 2 00:11:31.597 }, 00:11:31.597 { 00:11:31.597 "dma_device_id": "system", 00:11:31.597 "dma_device_type": 1 00:11:31.597 }, 00:11:31.597 { 00:11:31.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:31.597 "dma_device_type": 2 00:11:31.597 } 00:11:31.597 ], 00:11:31.597 "driver_specific": { 00:11:31.597 "raid": { 00:11:31.597 "uuid": "302dcdac-a85a-4b36-b00a-abbfdc128774", 00:11:31.597 "strip_size_kb": 64, 00:11:31.597 "state": "online", 00:11:31.597 "raid_level": "concat", 00:11:31.597 "superblock": true, 00:11:31.597 "num_base_bdevs": 2, 00:11:31.597 "num_base_bdevs_discovered": 2, 00:11:31.597 "num_base_bdevs_operational": 2, 00:11:31.597 "base_bdevs_list": [ 00:11:31.597 { 00:11:31.597 "name": "BaseBdev1", 00:11:31.597 "uuid": "4f213aed-cf16-4b56-893a-4392f99f631a", 00:11:31.597 "is_configured": true, 00:11:31.597 "data_offset": 2048, 00:11:31.597 "data_size": 63488 00:11:31.597 }, 00:11:31.597 { 00:11:31.597 "name": "BaseBdev2", 00:11:31.597 "uuid": "d461989a-180a-478a-aefa-7d5b72801aac", 00:11:31.597 "is_configured": true, 00:11:31.597 "data_offset": 2048, 00:11:31.597 "data_size": 63488 00:11:31.597 } 00:11:31.597 ] 00:11:31.597 } 00:11:31.597 } 00:11:31.597 }' 00:11:31.597 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:31.597 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:31.597 BaseBdev2' 00:11:31.597 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:31.597 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:31.597 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:31.855 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:31.855 "name": "BaseBdev1", 00:11:31.855 "aliases": [ 00:11:31.855 "4f213aed-cf16-4b56-893a-4392f99f631a" 00:11:31.855 ], 00:11:31.855 "product_name": "Malloc disk", 00:11:31.855 "block_size": 512, 00:11:31.855 "num_blocks": 65536, 00:11:31.855 "uuid": "4f213aed-cf16-4b56-893a-4392f99f631a", 00:11:31.855 "assigned_rate_limits": { 00:11:31.855 "rw_ios_per_sec": 0, 00:11:31.855 "rw_mbytes_per_sec": 0, 00:11:31.855 "r_mbytes_per_sec": 0, 00:11:31.855 "w_mbytes_per_sec": 0 00:11:31.855 }, 00:11:31.855 "claimed": true, 00:11:31.855 "claim_type": "exclusive_write", 00:11:31.855 "zoned": 
false, 00:11:31.855 "supported_io_types": { 00:11:31.855 "read": true, 00:11:31.855 "write": true, 00:11:31.855 "unmap": true, 00:11:31.855 "flush": true, 00:11:31.855 "reset": true, 00:11:31.855 "nvme_admin": false, 00:11:31.855 "nvme_io": false, 00:11:31.855 "nvme_io_md": false, 00:11:31.855 "write_zeroes": true, 00:11:31.855 "zcopy": true, 00:11:31.855 "get_zone_info": false, 00:11:31.855 "zone_management": false, 00:11:31.855 "zone_append": false, 00:11:31.855 "compare": false, 00:11:31.855 "compare_and_write": false, 00:11:31.855 "abort": true, 00:11:31.855 "seek_hole": false, 00:11:31.855 "seek_data": false, 00:11:31.855 "copy": true, 00:11:31.855 "nvme_iov_md": false 00:11:31.855 }, 00:11:31.855 "memory_domains": [ 00:11:31.855 { 00:11:31.855 "dma_device_id": "system", 00:11:31.855 "dma_device_type": 1 00:11:31.855 }, 00:11:31.855 { 00:11:31.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:31.855 "dma_device_type": 2 00:11:31.855 } 00:11:31.855 ], 00:11:31.855 "driver_specific": {} 00:11:31.855 }' 00:11:31.855 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:31.855 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:31.855 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:31.855 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:31.855 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:32.113 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:32.113 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:32.113 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:32.113 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:32.113 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:32.113 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:32.113 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:32.113 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:32.113 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:32.113 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:32.371 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:32.371 "name": "BaseBdev2", 00:11:32.371 "aliases": [ 00:11:32.371 "d461989a-180a-478a-aefa-7d5b72801aac" 00:11:32.371 ], 00:11:32.371 "product_name": "Malloc disk", 00:11:32.371 "block_size": 512, 00:11:32.371 "num_blocks": 65536, 00:11:32.371 "uuid": "d461989a-180a-478a-aefa-7d5b72801aac", 00:11:32.371 "assigned_rate_limits": { 00:11:32.371 "rw_ios_per_sec": 0, 00:11:32.371 "rw_mbytes_per_sec": 0, 00:11:32.371 "r_mbytes_per_sec": 0, 00:11:32.371 "w_mbytes_per_sec": 0 00:11:32.371 }, 00:11:32.371 "claimed": true, 00:11:32.371 "claim_type": "exclusive_write", 00:11:32.371 "zoned": false, 00:11:32.371 "supported_io_types": { 00:11:32.371 "read": true, 00:11:32.371 "write": true, 00:11:32.371 "unmap": true, 
00:11:32.371 "flush": true, 00:11:32.371 "reset": true, 00:11:32.371 "nvme_admin": false, 00:11:32.371 "nvme_io": false, 00:11:32.371 "nvme_io_md": false, 00:11:32.371 "write_zeroes": true, 00:11:32.371 "zcopy": true, 00:11:32.371 "get_zone_info": false, 00:11:32.371 "zone_management": false, 00:11:32.371 "zone_append": false, 00:11:32.371 "compare": false, 00:11:32.371 "compare_and_write": false, 00:11:32.371 "abort": true, 00:11:32.371 "seek_hole": false, 00:11:32.371 "seek_data": false, 00:11:32.371 "copy": true, 00:11:32.371 "nvme_iov_md": false 00:11:32.371 }, 00:11:32.371 "memory_domains": [ 00:11:32.371 { 00:11:32.371 "dma_device_id": "system", 00:11:32.371 "dma_device_type": 1 00:11:32.371 }, 00:11:32.371 { 00:11:32.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.372 "dma_device_type": 2 00:11:32.372 } 00:11:32.372 ], 00:11:32.372 "driver_specific": {} 00:11:32.372 }' 00:11:32.372 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:32.372 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:32.629 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:32.629 13:37:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:32.629 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:32.629 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:32.629 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:32.629 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:32.629 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:32.629 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:32.629 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:32.888 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:32.888 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:32.888 [2024-07-12 13:37:21.453585] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:32.888 [2024-07-12 13:37:21.453615] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:32.888 [2024-07-12 13:37:21.453656] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:33.146 
13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:33.146 "name": "Existed_Raid", 00:11:33.146 "uuid": "302dcdac-a85a-4b36-b00a-abbfdc128774", 00:11:33.146 "strip_size_kb": 64, 00:11:33.146 "state": "offline", 00:11:33.146 "raid_level": "concat", 00:11:33.146 "superblock": true, 00:11:33.146 "num_base_bdevs": 2, 00:11:33.146 "num_base_bdevs_discovered": 1, 00:11:33.146 "num_base_bdevs_operational": 1, 00:11:33.146 "base_bdevs_list": [ 00:11:33.146 { 00:11:33.146 "name": null, 00:11:33.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:33.146 "is_configured": false, 00:11:33.146 "data_offset": 2048, 00:11:33.146 "data_size": 63488 00:11:33.146 }, 00:11:33.146 { 00:11:33.146 "name": "BaseBdev2", 00:11:33.146 "uuid": "d461989a-180a-478a-aefa-7d5b72801aac", 00:11:33.146 "is_configured": true, 00:11:33.146 "data_offset": 2048, 00:11:33.146 "data_size": 63488 00:11:33.146 } 00:11:33.146 ] 00:11:33.146 }' 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:33.146 13:37:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:33.712 13:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:33.712 13:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:33.712 13:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.712 13:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:33.981 13:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:33.981 13:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:33.981 13:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:34.242 [2024-07-12 13:37:22.730053] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:34.242 [2024-07-12 
13:37:22.730107] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20d9a10 name Existed_Raid, state offline 00:11:34.242 13:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:34.242 13:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:34.242 13:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:34.242 13:37:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 434300 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 434300 ']' 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 434300 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 434300 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 434300' 00:11:34.502 killing process with pid 434300 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 434300 00:11:34.502 [2024-07-12 13:37:23.059524] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:34.502 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 434300 00:11:34.502 [2024-07-12 13:37:23.060407] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:34.760 13:37:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:34.760 00:11:34.760 real 0m10.727s 00:11:34.760 user 0m19.088s 00:11:34.760 sys 0m1.971s 00:11:34.760 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:34.760 13:37:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:34.760 ************************************ 00:11:34.760 END TEST raid_state_function_test_sb 00:11:34.760 ************************************ 00:11:34.760 13:37:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:34.760 13:37:23 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:11:34.760 13:37:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:34.760 13:37:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:34.760 13:37:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:35.019 
************************************ 00:11:35.019 START TEST raid_superblock_test 00:11:35.019 ************************************ 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=435932 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 435932 /var/tmp/spdk-raid.sock 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 435932 ']' 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:35.019 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:35.019 13:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.019 [2024-07-12 13:37:23.457637] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:11:35.019 [2024-07-12 13:37:23.457772] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid435932 ] 00:11:35.277 [2024-07-12 13:37:23.650202] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:35.277 [2024-07-12 13:37:23.750181] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:35.277 [2024-07-12 13:37:23.818833] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:35.277 [2024-07-12 13:37:23.818877] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:35.843 13:37:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:35.843 13:37:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:35.843 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:35.843 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:35.843 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:35.843 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:35.843 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:35.843 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:35.843 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:35.843 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:35.843 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:36.101 malloc1 00:11:36.359 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:36.359 [2024-07-12 13:37:24.926761] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:36.359 [2024-07-12 13:37:24.926809] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:36.359 [2024-07-12 13:37:24.926830] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x215be90 00:11:36.359 [2024-07-12 13:37:24.926842] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:36.359 [2024-07-12 13:37:24.928461] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:36.359 [2024-07-12 13:37:24.928489] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:36.359 pt1 00:11:36.618 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:36.618 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:36.618 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:36.618 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:36.618 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:36.618 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:36.618 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:36.618 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:36.618 13:37:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:36.618 malloc2 00:11:36.877 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:36.877 [2024-07-12 13:37:25.444825] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:36.877 [2024-07-12 13:37:25.444870] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:36.877 [2024-07-12 13:37:25.444887] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21f9fb0 00:11:36.877 [2024-07-12 13:37:25.444900] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:36.877 [2024-07-12 13:37:25.446336] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:36.877 [2024-07-12 13:37:25.446364] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:36.877 pt2 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:11:37.136 [2024-07-12 13:37:25.689495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:37.136 [2024-07-12 13:37:25.690866] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:37.136 [2024-07-12 13:37:25.691020] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21fa6b0 00:11:37.136 [2024-07-12 13:37:25.691033] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:37.136 [2024-07-12 13:37:25.691233] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x215d220 00:11:37.136 [2024-07-12 13:37:25.691370] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21fa6b0 00:11:37.136 [2024-07-12 13:37:25.691380] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21fa6b0 00:11:37.136 [2024-07-12 13:37:25.691477] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:37.136 13:37:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.136 13:37:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:37.703 13:37:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.703 "name": "raid_bdev1", 00:11:37.703 "uuid": "c2269b21-65b4-4e6c-94ec-fd8dbc6fde09", 00:11:37.703 "strip_size_kb": 64, 00:11:37.703 "state": "online", 00:11:37.703 "raid_level": "concat", 00:11:37.703 "superblock": true, 00:11:37.703 "num_base_bdevs": 2, 00:11:37.703 "num_base_bdevs_discovered": 2, 00:11:37.703 "num_base_bdevs_operational": 2, 00:11:37.703 "base_bdevs_list": [ 00:11:37.703 { 00:11:37.703 "name": "pt1", 00:11:37.703 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:37.703 "is_configured": true, 00:11:37.703 "data_offset": 2048, 00:11:37.703 "data_size": 63488 00:11:37.703 }, 00:11:37.703 { 00:11:37.703 "name": "pt2", 00:11:37.703 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:37.703 "is_configured": true, 00:11:37.703 "data_offset": 2048, 00:11:37.703 "data_size": 63488 00:11:37.703 } 00:11:37.703 ] 00:11:37.703 }' 00:11:37.703 13:37:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.703 13:37:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.639 13:37:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:38.639 13:37:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:38.639 13:37:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:38.639 13:37:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:38.639 13:37:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:38.639 13:37:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:38.639 13:37:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:38.639 13:37:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:38.639 [2024-07-12 13:37:27.101465] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:38.639 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:38.639 "name": "raid_bdev1", 00:11:38.639 "aliases": [ 00:11:38.639 "c2269b21-65b4-4e6c-94ec-fd8dbc6fde09" 00:11:38.639 ], 00:11:38.639 "product_name": "Raid Volume", 00:11:38.639 "block_size": 512, 00:11:38.639 "num_blocks": 126976, 00:11:38.639 "uuid": 
"c2269b21-65b4-4e6c-94ec-fd8dbc6fde09", 00:11:38.639 "assigned_rate_limits": { 00:11:38.639 "rw_ios_per_sec": 0, 00:11:38.639 "rw_mbytes_per_sec": 0, 00:11:38.639 "r_mbytes_per_sec": 0, 00:11:38.639 "w_mbytes_per_sec": 0 00:11:38.639 }, 00:11:38.639 "claimed": false, 00:11:38.639 "zoned": false, 00:11:38.639 "supported_io_types": { 00:11:38.639 "read": true, 00:11:38.639 "write": true, 00:11:38.639 "unmap": true, 00:11:38.639 "flush": true, 00:11:38.639 "reset": true, 00:11:38.639 "nvme_admin": false, 00:11:38.639 "nvme_io": false, 00:11:38.639 "nvme_io_md": false, 00:11:38.639 "write_zeroes": true, 00:11:38.639 "zcopy": false, 00:11:38.639 "get_zone_info": false, 00:11:38.639 "zone_management": false, 00:11:38.639 "zone_append": false, 00:11:38.639 "compare": false, 00:11:38.639 "compare_and_write": false, 00:11:38.639 "abort": false, 00:11:38.639 "seek_hole": false, 00:11:38.639 "seek_data": false, 00:11:38.639 "copy": false, 00:11:38.639 "nvme_iov_md": false 00:11:38.639 }, 00:11:38.639 "memory_domains": [ 00:11:38.639 { 00:11:38.639 "dma_device_id": "system", 00:11:38.639 "dma_device_type": 1 00:11:38.639 }, 00:11:38.639 { 00:11:38.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.639 "dma_device_type": 2 00:11:38.639 }, 00:11:38.639 { 00:11:38.639 "dma_device_id": "system", 00:11:38.639 "dma_device_type": 1 00:11:38.639 }, 00:11:38.639 { 00:11:38.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.639 "dma_device_type": 2 00:11:38.639 } 00:11:38.639 ], 00:11:38.639 "driver_specific": { 00:11:38.639 "raid": { 00:11:38.639 "uuid": "c2269b21-65b4-4e6c-94ec-fd8dbc6fde09", 00:11:38.639 "strip_size_kb": 64, 00:11:38.639 "state": "online", 00:11:38.639 "raid_level": "concat", 00:11:38.639 "superblock": true, 00:11:38.639 "num_base_bdevs": 2, 00:11:38.639 "num_base_bdevs_discovered": 2, 00:11:38.639 "num_base_bdevs_operational": 2, 00:11:38.639 "base_bdevs_list": [ 00:11:38.639 { 00:11:38.639 "name": "pt1", 00:11:38.639 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:38.639 "is_configured": true, 00:11:38.639 "data_offset": 2048, 00:11:38.639 "data_size": 63488 00:11:38.639 }, 00:11:38.639 { 00:11:38.639 "name": "pt2", 00:11:38.639 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:38.639 "is_configured": true, 00:11:38.639 "data_offset": 2048, 00:11:38.639 "data_size": 63488 00:11:38.640 } 00:11:38.640 ] 00:11:38.640 } 00:11:38.640 } 00:11:38.640 }' 00:11:38.640 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:38.640 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:38.640 pt2' 00:11:38.640 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:38.640 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:38.640 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:39.208 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:39.208 "name": "pt1", 00:11:39.208 "aliases": [ 00:11:39.208 "00000000-0000-0000-0000-000000000001" 00:11:39.208 ], 00:11:39.208 "product_name": "passthru", 00:11:39.208 "block_size": 512, 00:11:39.208 "num_blocks": 65536, 00:11:39.208 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:39.208 "assigned_rate_limits": { 00:11:39.208 
"rw_ios_per_sec": 0, 00:11:39.208 "rw_mbytes_per_sec": 0, 00:11:39.208 "r_mbytes_per_sec": 0, 00:11:39.208 "w_mbytes_per_sec": 0 00:11:39.208 }, 00:11:39.208 "claimed": true, 00:11:39.208 "claim_type": "exclusive_write", 00:11:39.208 "zoned": false, 00:11:39.208 "supported_io_types": { 00:11:39.208 "read": true, 00:11:39.208 "write": true, 00:11:39.208 "unmap": true, 00:11:39.208 "flush": true, 00:11:39.208 "reset": true, 00:11:39.208 "nvme_admin": false, 00:11:39.208 "nvme_io": false, 00:11:39.208 "nvme_io_md": false, 00:11:39.208 "write_zeroes": true, 00:11:39.208 "zcopy": true, 00:11:39.208 "get_zone_info": false, 00:11:39.208 "zone_management": false, 00:11:39.208 "zone_append": false, 00:11:39.208 "compare": false, 00:11:39.208 "compare_and_write": false, 00:11:39.208 "abort": true, 00:11:39.208 "seek_hole": false, 00:11:39.208 "seek_data": false, 00:11:39.208 "copy": true, 00:11:39.208 "nvme_iov_md": false 00:11:39.208 }, 00:11:39.208 "memory_domains": [ 00:11:39.208 { 00:11:39.208 "dma_device_id": "system", 00:11:39.208 "dma_device_type": 1 00:11:39.208 }, 00:11:39.208 { 00:11:39.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.208 "dma_device_type": 2 00:11:39.208 } 00:11:39.208 ], 00:11:39.208 "driver_specific": { 00:11:39.208 "passthru": { 00:11:39.208 "name": "pt1", 00:11:39.208 "base_bdev_name": "malloc1" 00:11:39.208 } 00:11:39.208 } 00:11:39.208 }' 00:11:39.208 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.208 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.467 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:39.467 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.467 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.467 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:39.467 13:37:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.467 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.726 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:39.726 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.726 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.726 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:39.726 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:39.726 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:39.726 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:39.985 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:39.985 "name": "pt2", 00:11:39.985 "aliases": [ 00:11:39.985 "00000000-0000-0000-0000-000000000002" 00:11:39.985 ], 00:11:39.985 "product_name": "passthru", 00:11:39.985 "block_size": 512, 00:11:39.985 "num_blocks": 65536, 00:11:39.985 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:39.985 "assigned_rate_limits": { 00:11:39.985 "rw_ios_per_sec": 0, 00:11:39.985 "rw_mbytes_per_sec": 0, 00:11:39.985 "r_mbytes_per_sec": 0, 00:11:39.985 "w_mbytes_per_sec": 0 
00:11:39.985 }, 00:11:39.985 "claimed": true, 00:11:39.985 "claim_type": "exclusive_write", 00:11:39.985 "zoned": false, 00:11:39.985 "supported_io_types": { 00:11:39.985 "read": true, 00:11:39.985 "write": true, 00:11:39.985 "unmap": true, 00:11:39.985 "flush": true, 00:11:39.985 "reset": true, 00:11:39.985 "nvme_admin": false, 00:11:39.985 "nvme_io": false, 00:11:39.985 "nvme_io_md": false, 00:11:39.985 "write_zeroes": true, 00:11:39.985 "zcopy": true, 00:11:39.985 "get_zone_info": false, 00:11:39.985 "zone_management": false, 00:11:39.985 "zone_append": false, 00:11:39.985 "compare": false, 00:11:39.985 "compare_and_write": false, 00:11:39.985 "abort": true, 00:11:39.985 "seek_hole": false, 00:11:39.985 "seek_data": false, 00:11:39.985 "copy": true, 00:11:39.985 "nvme_iov_md": false 00:11:39.985 }, 00:11:39.985 "memory_domains": [ 00:11:39.985 { 00:11:39.985 "dma_device_id": "system", 00:11:39.985 "dma_device_type": 1 00:11:39.985 }, 00:11:39.985 { 00:11:39.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.985 "dma_device_type": 2 00:11:39.985 } 00:11:39.985 ], 00:11:39.985 "driver_specific": { 00:11:39.985 "passthru": { 00:11:39.985 "name": "pt2", 00:11:39.985 "base_bdev_name": "malloc2" 00:11:39.985 } 00:11:39.985 } 00:11:39.985 }' 00:11:39.985 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:40.244 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:40.244 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:40.244 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:40.244 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:40.244 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:40.244 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:40.244 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:40.504 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:40.504 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:40.504 13:37:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:40.504 13:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:40.504 13:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:40.504 13:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:41.072 [2024-07-12 13:37:29.491821] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:41.072 13:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c2269b21-65b4-4e6c-94ec-fd8dbc6fde09 00:11:41.072 13:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c2269b21-65b4-4e6c-94ec-fd8dbc6fde09 ']' 00:11:41.072 13:37:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:41.640 [2024-07-12 13:37:30.004916] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:41.640 [2024-07-12 13:37:30.004948] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:11:41.640 [2024-07-12 13:37:30.005008] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:41.640 [2024-07-12 13:37:30.005051] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:41.640 [2024-07-12 13:37:30.005063] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21fa6b0 name raid_bdev1, state offline 00:11:41.640 13:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:41.640 13:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:41.899 13:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:41.899 13:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:41.899 13:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:41.899 13:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:41.899 13:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:41.899 13:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:42.158 13:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:42.158 13:37:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:42.727 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:43.296 [2024-07-12 13:37:31.725387] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:43.296 [2024-07-12 13:37:31.726779] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:43.296 [2024-07-12 13:37:31.726831] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:43.296 [2024-07-12 13:37:31.726872] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:43.296 [2024-07-12 13:37:31.726891] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:43.296 [2024-07-12 13:37:31.726900] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x215e440 name raid_bdev1, state configuring 00:11:43.296 request: 00:11:43.296 { 00:11:43.296 "name": "raid_bdev1", 00:11:43.296 "raid_level": "concat", 00:11:43.296 "base_bdevs": [ 00:11:43.296 "malloc1", 00:11:43.296 "malloc2" 00:11:43.296 ], 00:11:43.296 "strip_size_kb": 64, 00:11:43.296 "superblock": false, 00:11:43.296 "method": "bdev_raid_create", 00:11:43.296 "req_id": 1 00:11:43.296 } 00:11:43.296 Got JSON-RPC error response 00:11:43.296 response: 00:11:43.296 { 00:11:43.296 "code": -17, 00:11:43.296 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:43.296 } 00:11:43.296 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:43.296 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:43.296 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:43.296 13:37:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:43.296 13:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.297 13:37:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:43.865 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:43.865 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:43.865 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:44.125 [2024-07-12 13:37:32.503368] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:44.125 [2024-07-12 13:37:32.503420] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:44.125 [2024-07-12 13:37:32.503439] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x215f040 00:11:44.125 [2024-07-12 13:37:32.503452] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:44.125 [2024-07-12 13:37:32.505120] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:44.125 [2024-07-12 13:37:32.505149] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:44.125 [2024-07-12 13:37:32.505217] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:44.125 [2024-07-12 13:37:32.505242] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:44.125 pt1 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.125 13:37:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:44.694 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.694 "name": "raid_bdev1", 00:11:44.694 "uuid": "c2269b21-65b4-4e6c-94ec-fd8dbc6fde09", 00:11:44.694 "strip_size_kb": 64, 00:11:44.694 "state": "configuring", 00:11:44.694 "raid_level": "concat", 00:11:44.694 "superblock": true, 00:11:44.694 "num_base_bdevs": 2, 00:11:44.694 "num_base_bdevs_discovered": 1, 00:11:44.694 "num_base_bdevs_operational": 2, 00:11:44.694 "base_bdevs_list": [ 00:11:44.694 { 00:11:44.694 "name": "pt1", 00:11:44.694 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:44.694 "is_configured": true, 00:11:44.694 "data_offset": 2048, 00:11:44.694 "data_size": 63488 00:11:44.694 }, 00:11:44.694 { 00:11:44.694 "name": null, 00:11:44.694 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:44.694 "is_configured": false, 00:11:44.694 "data_offset": 2048, 00:11:44.694 "data_size": 63488 00:11:44.694 } 00:11:44.694 ] 00:11:44.694 }' 00:11:44.694 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.694 13:37:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.261 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:45.261 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:45.261 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:45.261 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:45.520 [2024-07-12 13:37:33.867001] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:45.520 [2024-07-12 13:37:33.867050] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:45.520 [2024-07-12 13:37:33.867068] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x215cab0 00:11:45.520 [2024-07-12 13:37:33.867081] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:45.520 [2024-07-12 13:37:33.867417] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:45.520 [2024-07-12 13:37:33.867434] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:45.520 [2024-07-12 13:37:33.867495] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:45.520 [2024-07-12 13:37:33.867514] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:45.520 [2024-07-12 13:37:33.867606] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21fc970 00:11:45.520 [2024-07-12 13:37:33.867617] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:45.520 [2024-07-12 13:37:33.867782] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21fa580 00:11:45.520 [2024-07-12 13:37:33.867898] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21fc970 00:11:45.520 [2024-07-12 13:37:33.867908] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21fc970 00:11:45.520 [2024-07-12 13:37:33.868029] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:45.520 pt2 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.520 13:37:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:45.779 13:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:11:45.779 "name": "raid_bdev1", 00:11:45.779 "uuid": "c2269b21-65b4-4e6c-94ec-fd8dbc6fde09", 00:11:45.779 "strip_size_kb": 64, 00:11:45.779 "state": "online", 00:11:45.779 "raid_level": "concat", 00:11:45.779 "superblock": true, 00:11:45.779 "num_base_bdevs": 2, 00:11:45.779 "num_base_bdevs_discovered": 2, 00:11:45.779 "num_base_bdevs_operational": 2, 00:11:45.779 "base_bdevs_list": [ 00:11:45.779 { 00:11:45.779 "name": "pt1", 00:11:45.779 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:45.779 "is_configured": true, 00:11:45.779 "data_offset": 2048, 00:11:45.779 "data_size": 63488 00:11:45.779 }, 00:11:45.779 { 00:11:45.779 "name": "pt2", 00:11:45.779 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:45.779 "is_configured": true, 00:11:45.779 "data_offset": 2048, 00:11:45.779 "data_size": 63488 00:11:45.779 } 00:11:45.779 ] 00:11:45.779 }' 00:11:45.779 13:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:45.779 13:37:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:46.348 13:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:46.349 13:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:46.349 13:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:46.349 13:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:46.349 13:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:46.349 13:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:46.349 13:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:46.349 13:37:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:46.607 [2024-07-12 13:37:34.990232] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:46.607 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:46.607 "name": "raid_bdev1", 00:11:46.607 "aliases": [ 00:11:46.607 "c2269b21-65b4-4e6c-94ec-fd8dbc6fde09" 00:11:46.607 ], 00:11:46.607 "product_name": "Raid Volume", 00:11:46.607 "block_size": 512, 00:11:46.608 "num_blocks": 126976, 00:11:46.608 "uuid": "c2269b21-65b4-4e6c-94ec-fd8dbc6fde09", 00:11:46.608 "assigned_rate_limits": { 00:11:46.608 "rw_ios_per_sec": 0, 00:11:46.608 "rw_mbytes_per_sec": 0, 00:11:46.608 "r_mbytes_per_sec": 0, 00:11:46.608 "w_mbytes_per_sec": 0 00:11:46.608 }, 00:11:46.608 "claimed": false, 00:11:46.608 "zoned": false, 00:11:46.608 "supported_io_types": { 00:11:46.608 "read": true, 00:11:46.608 "write": true, 00:11:46.608 "unmap": true, 00:11:46.608 "flush": true, 00:11:46.608 "reset": true, 00:11:46.608 "nvme_admin": false, 00:11:46.608 "nvme_io": false, 00:11:46.608 "nvme_io_md": false, 00:11:46.608 "write_zeroes": true, 00:11:46.608 "zcopy": false, 00:11:46.608 "get_zone_info": false, 00:11:46.608 "zone_management": false, 00:11:46.608 "zone_append": false, 00:11:46.608 "compare": false, 00:11:46.608 "compare_and_write": false, 00:11:46.608 "abort": false, 00:11:46.608 "seek_hole": false, 00:11:46.608 "seek_data": false, 00:11:46.608 "copy": false, 00:11:46.608 "nvme_iov_md": false 00:11:46.608 }, 00:11:46.608 "memory_domains": [ 00:11:46.608 { 00:11:46.608 "dma_device_id": 
"system", 00:11:46.608 "dma_device_type": 1 00:11:46.608 }, 00:11:46.608 { 00:11:46.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.608 "dma_device_type": 2 00:11:46.608 }, 00:11:46.608 { 00:11:46.608 "dma_device_id": "system", 00:11:46.608 "dma_device_type": 1 00:11:46.608 }, 00:11:46.608 { 00:11:46.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.608 "dma_device_type": 2 00:11:46.608 } 00:11:46.608 ], 00:11:46.608 "driver_specific": { 00:11:46.608 "raid": { 00:11:46.608 "uuid": "c2269b21-65b4-4e6c-94ec-fd8dbc6fde09", 00:11:46.608 "strip_size_kb": 64, 00:11:46.608 "state": "online", 00:11:46.608 "raid_level": "concat", 00:11:46.608 "superblock": true, 00:11:46.608 "num_base_bdevs": 2, 00:11:46.608 "num_base_bdevs_discovered": 2, 00:11:46.608 "num_base_bdevs_operational": 2, 00:11:46.608 "base_bdevs_list": [ 00:11:46.608 { 00:11:46.608 "name": "pt1", 00:11:46.608 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:46.608 "is_configured": true, 00:11:46.608 "data_offset": 2048, 00:11:46.608 "data_size": 63488 00:11:46.608 }, 00:11:46.608 { 00:11:46.608 "name": "pt2", 00:11:46.608 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:46.608 "is_configured": true, 00:11:46.608 "data_offset": 2048, 00:11:46.608 "data_size": 63488 00:11:46.608 } 00:11:46.608 ] 00:11:46.608 } 00:11:46.608 } 00:11:46.608 }' 00:11:46.608 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:46.608 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:46.608 pt2' 00:11:46.608 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:46.608 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:46.608 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:46.867 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:46.867 "name": "pt1", 00:11:46.867 "aliases": [ 00:11:46.867 "00000000-0000-0000-0000-000000000001" 00:11:46.867 ], 00:11:46.867 "product_name": "passthru", 00:11:46.867 "block_size": 512, 00:11:46.867 "num_blocks": 65536, 00:11:46.867 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:46.867 "assigned_rate_limits": { 00:11:46.867 "rw_ios_per_sec": 0, 00:11:46.867 "rw_mbytes_per_sec": 0, 00:11:46.867 "r_mbytes_per_sec": 0, 00:11:46.867 "w_mbytes_per_sec": 0 00:11:46.867 }, 00:11:46.867 "claimed": true, 00:11:46.867 "claim_type": "exclusive_write", 00:11:46.867 "zoned": false, 00:11:46.867 "supported_io_types": { 00:11:46.867 "read": true, 00:11:46.867 "write": true, 00:11:46.867 "unmap": true, 00:11:46.867 "flush": true, 00:11:46.867 "reset": true, 00:11:46.867 "nvme_admin": false, 00:11:46.867 "nvme_io": false, 00:11:46.867 "nvme_io_md": false, 00:11:46.867 "write_zeroes": true, 00:11:46.867 "zcopy": true, 00:11:46.867 "get_zone_info": false, 00:11:46.867 "zone_management": false, 00:11:46.867 "zone_append": false, 00:11:46.867 "compare": false, 00:11:46.867 "compare_and_write": false, 00:11:46.867 "abort": true, 00:11:46.867 "seek_hole": false, 00:11:46.867 "seek_data": false, 00:11:46.867 "copy": true, 00:11:46.867 "nvme_iov_md": false 00:11:46.867 }, 00:11:46.867 "memory_domains": [ 00:11:46.867 { 00:11:46.867 "dma_device_id": "system", 00:11:46.867 "dma_device_type": 1 00:11:46.867 }, 
00:11:46.867 { 00:11:46.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.867 "dma_device_type": 2 00:11:46.867 } 00:11:46.867 ], 00:11:46.867 "driver_specific": { 00:11:46.867 "passthru": { 00:11:46.867 "name": "pt1", 00:11:46.867 "base_bdev_name": "malloc1" 00:11:46.867 } 00:11:46.867 } 00:11:46.867 }' 00:11:46.867 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:46.867 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:46.867 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:46.867 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:46.867 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:47.126 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:47.126 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:47.126 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:47.126 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:47.126 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:47.126 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:47.126 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:47.126 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:47.126 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:47.126 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:47.386 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:47.386 "name": "pt2", 00:11:47.386 "aliases": [ 00:11:47.386 "00000000-0000-0000-0000-000000000002" 00:11:47.386 ], 00:11:47.386 "product_name": "passthru", 00:11:47.386 "block_size": 512, 00:11:47.386 "num_blocks": 65536, 00:11:47.386 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:47.386 "assigned_rate_limits": { 00:11:47.386 "rw_ios_per_sec": 0, 00:11:47.386 "rw_mbytes_per_sec": 0, 00:11:47.386 "r_mbytes_per_sec": 0, 00:11:47.386 "w_mbytes_per_sec": 0 00:11:47.386 }, 00:11:47.386 "claimed": true, 00:11:47.386 "claim_type": "exclusive_write", 00:11:47.386 "zoned": false, 00:11:47.386 "supported_io_types": { 00:11:47.386 "read": true, 00:11:47.386 "write": true, 00:11:47.386 "unmap": true, 00:11:47.386 "flush": true, 00:11:47.386 "reset": true, 00:11:47.386 "nvme_admin": false, 00:11:47.386 "nvme_io": false, 00:11:47.386 "nvme_io_md": false, 00:11:47.386 "write_zeroes": true, 00:11:47.386 "zcopy": true, 00:11:47.386 "get_zone_info": false, 00:11:47.386 "zone_management": false, 00:11:47.386 "zone_append": false, 00:11:47.386 "compare": false, 00:11:47.386 "compare_and_write": false, 00:11:47.386 "abort": true, 00:11:47.386 "seek_hole": false, 00:11:47.386 "seek_data": false, 00:11:47.386 "copy": true, 00:11:47.386 "nvme_iov_md": false 00:11:47.386 }, 00:11:47.386 "memory_domains": [ 00:11:47.386 { 00:11:47.386 "dma_device_id": "system", 00:11:47.386 "dma_device_type": 1 00:11:47.386 }, 00:11:47.386 { 00:11:47.386 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.386 "dma_device_type": 2 00:11:47.386 } 00:11:47.386 ], 
00:11:47.386 "driver_specific": { 00:11:47.386 "passthru": { 00:11:47.386 "name": "pt2", 00:11:47.386 "base_bdev_name": "malloc2" 00:11:47.386 } 00:11:47.386 } 00:11:47.386 }' 00:11:47.386 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:47.386 13:37:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:47.645 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:47.645 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:47.645 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:47.645 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:47.645 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:47.645 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:47.645 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:47.645 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:47.645 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:47.904 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:47.904 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:47.904 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:47.904 [2024-07-12 13:37:36.486234] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c2269b21-65b4-4e6c-94ec-fd8dbc6fde09 '!=' c2269b21-65b4-4e6c-94ec-fd8dbc6fde09 ']' 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 435932 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 435932 ']' 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 435932 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 435932 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 435932' 00:11:48.164 killing process with pid 435932 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 435932 00:11:48.164 [2024-07-12 13:37:36.554717] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:48.164 [2024-07-12 
13:37:36.554772] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:48.164 [2024-07-12 13:37:36.554821] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:48.164 [2024-07-12 13:37:36.554833] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21fc970 name raid_bdev1, state offline 00:11:48.164 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 435932 00:11:48.164 [2024-07-12 13:37:36.571000] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:48.424 13:37:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:48.424 00:11:48.424 real 0m13.437s 00:11:48.424 user 0m24.118s 00:11:48.424 sys 0m2.419s 00:11:48.424 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:48.424 13:37:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:48.424 ************************************ 00:11:48.424 END TEST raid_superblock_test 00:11:48.424 ************************************ 00:11:48.424 13:37:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:48.424 13:37:36 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:11:48.424 13:37:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:48.424 13:37:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:48.424 13:37:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:48.424 ************************************ 00:11:48.424 START TEST raid_read_error_test 00:11:48.424 ************************************ 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:48.424 13:37:36 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.FPk3oIPzev 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=437900 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 437900 /var/tmp/spdk-raid.sock 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 437900 ']' 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:48.424 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:48.424 13:37:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:48.424 [2024-07-12 13:37:36.940996] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:11:48.424 [2024-07-12 13:37:36.941063] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid437900 ] 00:11:48.684 [2024-07-12 13:37:37.071080] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:48.684 [2024-07-12 13:37:37.177056] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:48.684 [2024-07-12 13:37:37.246604] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:48.684 [2024-07-12 13:37:37.246643] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:49.622 13:37:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:49.622 13:37:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:49.622 13:37:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:49.622 13:37:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:49.622 BaseBdev1_malloc 00:11:49.622 13:37:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:49.882 true 00:11:49.882 13:37:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:50.141 [2024-07-12 13:37:38.582023] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:50.141 [2024-07-12 13:37:38.582069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:50.141 [2024-07-12 13:37:38.582091] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x153aa10 00:11:50.141 [2024-07-12 13:37:38.582104] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:50.141 [2024-07-12 13:37:38.584082] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:50.141 [2024-07-12 13:37:38.584113] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:50.141 BaseBdev1 00:11:50.141 13:37:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:50.141 13:37:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:50.401 BaseBdev2_malloc 00:11:50.401 13:37:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:50.660 true 00:11:50.660 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:50.920 [2024-07-12 13:37:39.309757] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:50.920 [2024-07-12 13:37:39.309801] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:50.920 [2024-07-12 13:37:39.309824] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x153f250 00:11:50.920 [2024-07-12 13:37:39.309845] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:50.920 [2024-07-12 13:37:39.311470] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:50.920 [2024-07-12 13:37:39.311500] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:50.920 BaseBdev2 00:11:50.920 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:51.179 [2024-07-12 13:37:39.554446] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:51.179 [2024-07-12 13:37:39.555851] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:51.179 [2024-07-12 13:37:39.556046] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1540c60 00:11:51.179 [2024-07-12 13:37:39.556060] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:51.179 [2024-07-12 13:37:39.556260] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1541bd0 00:11:51.179 [2024-07-12 13:37:39.556408] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1540c60 00:11:51.179 [2024-07-12 13:37:39.556420] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1540c60 00:11:51.179 [2024-07-12 13:37:39.556527] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:51.179 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:51.439 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:51.439 "name": "raid_bdev1", 00:11:51.439 "uuid": "cf5d1331-e99e-4833-8ab2-acc5b7ba994c", 00:11:51.439 "strip_size_kb": 64, 00:11:51.439 "state": "online", 00:11:51.439 "raid_level": 
"concat", 00:11:51.439 "superblock": true, 00:11:51.439 "num_base_bdevs": 2, 00:11:51.439 "num_base_bdevs_discovered": 2, 00:11:51.439 "num_base_bdevs_operational": 2, 00:11:51.439 "base_bdevs_list": [ 00:11:51.439 { 00:11:51.439 "name": "BaseBdev1", 00:11:51.439 "uuid": "f2798484-0672-5f0a-b46f-78903e292557", 00:11:51.439 "is_configured": true, 00:11:51.439 "data_offset": 2048, 00:11:51.439 "data_size": 63488 00:11:51.439 }, 00:11:51.439 { 00:11:51.439 "name": "BaseBdev2", 00:11:51.439 "uuid": "2b40eb59-de8d-5801-b9e7-a51dffbc5902", 00:11:51.439 "is_configured": true, 00:11:51.439 "data_offset": 2048, 00:11:51.439 "data_size": 63488 00:11:51.439 } 00:11:51.439 ] 00:11:51.439 }' 00:11:51.439 13:37:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:51.439 13:37:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:52.006 13:37:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:52.006 13:37:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:52.006 [2024-07-12 13:37:40.461138] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x153c2f0 00:11:52.944 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:53.203 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:53.203 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:11:53.203 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:53.203 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:53.204 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:53.204 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:53.204 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:53.204 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:53.204 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:53.204 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.204 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.204 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.204 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.204 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.204 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:53.463 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.463 "name": "raid_bdev1", 00:11:53.463 "uuid": "cf5d1331-e99e-4833-8ab2-acc5b7ba994c", 00:11:53.463 "strip_size_kb": 64, 00:11:53.463 "state": "online", 
00:11:53.463 "raid_level": "concat", 00:11:53.463 "superblock": true, 00:11:53.463 "num_base_bdevs": 2, 00:11:53.463 "num_base_bdevs_discovered": 2, 00:11:53.463 "num_base_bdevs_operational": 2, 00:11:53.463 "base_bdevs_list": [ 00:11:53.463 { 00:11:53.463 "name": "BaseBdev1", 00:11:53.463 "uuid": "f2798484-0672-5f0a-b46f-78903e292557", 00:11:53.463 "is_configured": true, 00:11:53.463 "data_offset": 2048, 00:11:53.463 "data_size": 63488 00:11:53.463 }, 00:11:53.463 { 00:11:53.463 "name": "BaseBdev2", 00:11:53.463 "uuid": "2b40eb59-de8d-5801-b9e7-a51dffbc5902", 00:11:53.463 "is_configured": true, 00:11:53.463 "data_offset": 2048, 00:11:53.463 "data_size": 63488 00:11:53.463 } 00:11:53.463 ] 00:11:53.463 }' 00:11:53.463 13:37:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.463 13:37:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.031 13:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:54.290 [2024-07-12 13:37:42.625180] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:54.290 [2024-07-12 13:37:42.625214] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:54.290 [2024-07-12 13:37:42.628452] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:54.290 [2024-07-12 13:37:42.628484] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:54.290 [2024-07-12 13:37:42.628512] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:54.290 [2024-07-12 13:37:42.628524] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1540c60 name raid_bdev1, state offline 00:11:54.290 0 00:11:54.290 13:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 437900 00:11:54.290 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 437900 ']' 00:11:54.290 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 437900 00:11:54.290 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:54.290 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:54.290 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 437900 00:11:54.290 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:54.290 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:54.290 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 437900' 00:11:54.290 killing process with pid 437900 00:11:54.290 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 437900 00:11:54.290 [2024-07-12 13:37:42.708154] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:54.290 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 437900 00:11:54.290 [2024-07-12 13:37:42.718469] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:54.556 13:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.FPk3oIPzev 00:11:54.556 13:37:42 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:54.556 13:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:54.556 13:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:11:54.556 13:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:11:54.556 13:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:54.556 13:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:54.556 13:37:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:11:54.556 00:11:54.556 real 0m6.087s 00:11:54.556 user 0m9.490s 00:11:54.556 sys 0m1.051s 00:11:54.556 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:54.556 13:37:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.556 ************************************ 00:11:54.556 END TEST raid_read_error_test 00:11:54.556 ************************************ 00:11:54.556 13:37:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:54.556 13:37:42 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:11:54.556 13:37:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:54.556 13:37:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:54.556 13:37:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:54.556 ************************************ 00:11:54.556 START TEST raid_write_error_test 00:11:54.556 ************************************ 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.eqEDEsuiyA 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=438707 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 438707 /var/tmp/spdk-raid.sock 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 438707 ']' 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:54.556 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:54.556 13:37:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.556 [2024-07-12 13:37:43.109903] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
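The trace below drives this bdevperf instance over its RPC socket to build a 2-disk concat array on top of error-injectable base devices, inject write failures, and confirm the array keeps serving I/O. Condensed into plain shell purely as a sketch (paths, sockets and arguments are taken from the trace itself; the authoritative logic lives in the bdev/bdev_raid.sh functions shown in the xtrace prefixes), the flow is roughly:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Each base device is a malloc bdev wrapped in an error bdev and a passthru
    # bdev (EE_BaseBdevN_malloc -> BaseBdevN) so failures can be injected later.
    for i in 1 2; do
        $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
        $RPC bdev_error_create BaseBdev${i}_malloc
        $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
    done

    # Assemble a two-disk concat array with a 64k strip size and a superblock.
    $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s

    # Kick off the bdevperf workload, then inject write failures on BaseBdev1.
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
        -s /var/tmp/spdk-raid.sock perform_tests &
    sleep 1
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure

    # The array must stay online with both base bdevs operational ...
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

    # ... while the bdevperf log reports a non-zero failure rate for raid_bdev1.
    fail_per_s=$(grep -v Job /raidtest/tmp.eqEDEsuiyA | grep raid_bdev1 | awk '{print $6}')
    [[ $fail_per_s != "0.00" ]]
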
00:11:54.556 [2024-07-12 13:37:43.109965] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid438707 ] 00:11:54.818 [2024-07-12 13:37:43.224957] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:54.818 [2024-07-12 13:37:43.331545] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:54.818 [2024-07-12 13:37:43.399944] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:54.818 [2024-07-12 13:37:43.399981] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:55.754 13:37:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:55.754 13:37:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:55.754 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:55.754 13:37:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:55.754 BaseBdev1_malloc 00:11:55.754 13:37:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:55.754 true 00:11:56.013 13:37:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:56.013 [2024-07-12 13:37:44.491242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:56.013 [2024-07-12 13:37:44.491288] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:56.013 [2024-07-12 13:37:44.491306] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x189ea10 00:11:56.013 [2024-07-12 13:37:44.491319] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:56.013 [2024-07-12 13:37:44.493018] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:56.013 [2024-07-12 13:37:44.493046] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:56.013 BaseBdev1 00:11:56.013 13:37:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:56.013 13:37:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:56.272 BaseBdev2_malloc 00:11:56.272 13:37:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:56.531 true 00:11:56.531 13:37:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:56.531 [2024-07-12 13:37:45.109562] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:56.531 [2024-07-12 13:37:45.109608] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:56.531 [2024-07-12 13:37:45.109627] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18a3250 00:11:56.531 [2024-07-12 13:37:45.109640] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:56.531 [2024-07-12 13:37:45.111085] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:56.531 [2024-07-12 13:37:45.111113] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:56.790 BaseBdev2 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:56.790 [2024-07-12 13:37:45.282051] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:56.790 [2024-07-12 13:37:45.283283] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:56.790 [2024-07-12 13:37:45.283463] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18a4c60 00:11:56.790 [2024-07-12 13:37:45.283476] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:56.790 [2024-07-12 13:37:45.283660] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a5bd0 00:11:56.790 [2024-07-12 13:37:45.283799] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18a4c60 00:11:56.790 [2024-07-12 13:37:45.283809] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18a4c60 00:11:56.790 [2024-07-12 13:37:45.283909] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.790 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:57.049 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.049 "name": "raid_bdev1", 00:11:57.049 "uuid": "e3083c64-d402-42cf-a5f8-d6e91d398bba", 00:11:57.049 "strip_size_kb": 64, 00:11:57.049 "state": "online", 00:11:57.049 
"raid_level": "concat", 00:11:57.049 "superblock": true, 00:11:57.049 "num_base_bdevs": 2, 00:11:57.049 "num_base_bdevs_discovered": 2, 00:11:57.049 "num_base_bdevs_operational": 2, 00:11:57.049 "base_bdevs_list": [ 00:11:57.049 { 00:11:57.049 "name": "BaseBdev1", 00:11:57.049 "uuid": "4890fa67-cc81-53db-8eb4-ddeb788ce469", 00:11:57.049 "is_configured": true, 00:11:57.049 "data_offset": 2048, 00:11:57.049 "data_size": 63488 00:11:57.049 }, 00:11:57.049 { 00:11:57.049 "name": "BaseBdev2", 00:11:57.049 "uuid": "267d688d-fadf-5d9b-a145-b194ce00bd0a", 00:11:57.049 "is_configured": true, 00:11:57.049 "data_offset": 2048, 00:11:57.049 "data_size": 63488 00:11:57.049 } 00:11:57.049 ] 00:11:57.049 }' 00:11:57.049 13:37:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.049 13:37:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.616 13:37:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:57.616 13:37:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:57.875 [2024-07-12 13:37:46.268989] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a02f0 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.813 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:59.073 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:59.073 "name": "raid_bdev1", 00:11:59.073 "uuid": "e3083c64-d402-42cf-a5f8-d6e91d398bba", 00:11:59.073 "strip_size_kb": 
64, 00:11:59.073 "state": "online", 00:11:59.073 "raid_level": "concat", 00:11:59.073 "superblock": true, 00:11:59.073 "num_base_bdevs": 2, 00:11:59.073 "num_base_bdevs_discovered": 2, 00:11:59.073 "num_base_bdevs_operational": 2, 00:11:59.073 "base_bdevs_list": [ 00:11:59.073 { 00:11:59.073 "name": "BaseBdev1", 00:11:59.073 "uuid": "4890fa67-cc81-53db-8eb4-ddeb788ce469", 00:11:59.073 "is_configured": true, 00:11:59.073 "data_offset": 2048, 00:11:59.073 "data_size": 63488 00:11:59.073 }, 00:11:59.073 { 00:11:59.073 "name": "BaseBdev2", 00:11:59.073 "uuid": "267d688d-fadf-5d9b-a145-b194ce00bd0a", 00:11:59.073 "is_configured": true, 00:11:59.073 "data_offset": 2048, 00:11:59.073 "data_size": 63488 00:11:59.073 } 00:11:59.073 ] 00:11:59.073 }' 00:11:59.073 13:37:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:59.073 13:37:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:59.642 13:37:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:59.903 [2024-07-12 13:37:48.369378] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:59.903 [2024-07-12 13:37:48.369429] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:59.903 [2024-07-12 13:37:48.372598] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:59.903 [2024-07-12 13:37:48.372630] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:59.903 [2024-07-12 13:37:48.372657] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:59.903 [2024-07-12 13:37:48.372668] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18a4c60 name raid_bdev1, state offline 00:11:59.903 0 00:11:59.903 13:37:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 438707 00:11:59.903 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 438707 ']' 00:11:59.903 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 438707 00:11:59.903 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:59.903 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:59.903 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 438707 00:11:59.903 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:59.903 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:59.903 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 438707' 00:11:59.903 killing process with pid 438707 00:11:59.903 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 438707 00:11:59.903 [2024-07-12 13:37:48.452395] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:59.903 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 438707 00:11:59.903 [2024-07-12 13:37:48.463346] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:00.164 13:37:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.eqEDEsuiyA 
00:12:00.164 13:37:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:00.164 13:37:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:00.164 13:37:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.48 00:12:00.164 13:37:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:00.164 13:37:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:00.164 13:37:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:00.164 13:37:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.48 != \0\.\0\0 ]] 00:12:00.164 00:12:00.164 real 0m5.668s 00:12:00.164 user 0m8.676s 00:12:00.164 sys 0m1.029s 00:12:00.164 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:00.164 13:37:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:00.164 ************************************ 00:12:00.164 END TEST raid_write_error_test 00:12:00.164 ************************************ 00:12:00.424 13:37:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:00.424 13:37:48 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:00.424 13:37:48 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:12:00.424 13:37:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:00.424 13:37:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:00.424 13:37:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:00.424 ************************************ 00:12:00.424 START TEST raid_state_function_test 00:12:00.424 ************************************ 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=439610 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 439610' 00:12:00.424 Process raid pid: 439610 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 439610 /var/tmp/spdk-raid.sock 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 439610 ']' 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:00.424 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:00.424 13:37:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:00.424 [2024-07-12 13:37:48.864325] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
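The state-function test that follows exercises RAID state transitions against this bdev_svc app; no I/O is issued. Reduced to the rpc.py calls visible in the trace (a sketch only, with the intermediate delete/re-create steps between checks omitted), the configuring-to-online transition it verifies is roughly:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Creating the array before its base bdevs exist parks it in "configuring".
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # configuring

    # As each named base bdev appears it is claimed by the waiting array;
    # once both are present the array transitions to "online".
    $RPC bdev_malloc_create 32 512 -b BaseBdev1
    $RPC bdev_malloc_create 32 512 -b BaseBdev2
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # online
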
00:12:00.424 [2024-07-12 13:37:48.864396] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:00.424 [2024-07-12 13:37:48.997749] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:00.696 [2024-07-12 13:37:49.108520] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:00.696 [2024-07-12 13:37:49.167346] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:00.696 [2024-07-12 13:37:49.167382] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:01.296 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:01.296 13:37:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:01.296 13:37:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:01.577 [2024-07-12 13:37:50.019882] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:01.577 [2024-07-12 13:37:50.019932] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:01.577 [2024-07-12 13:37:50.019944] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:01.577 [2024-07-12 13:37:50.019956] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.577 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:01.873 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:01.873 "name": "Existed_Raid", 00:12:01.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:01.873 "strip_size_kb": 0, 00:12:01.873 "state": "configuring", 00:12:01.873 "raid_level": "raid1", 00:12:01.873 "superblock": false, 00:12:01.873 
"num_base_bdevs": 2, 00:12:01.873 "num_base_bdevs_discovered": 0, 00:12:01.873 "num_base_bdevs_operational": 2, 00:12:01.873 "base_bdevs_list": [ 00:12:01.873 { 00:12:01.873 "name": "BaseBdev1", 00:12:01.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:01.873 "is_configured": false, 00:12:01.873 "data_offset": 0, 00:12:01.873 "data_size": 0 00:12:01.873 }, 00:12:01.873 { 00:12:01.873 "name": "BaseBdev2", 00:12:01.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:01.873 "is_configured": false, 00:12:01.873 "data_offset": 0, 00:12:01.873 "data_size": 0 00:12:01.873 } 00:12:01.873 ] 00:12:01.873 }' 00:12:01.873 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:01.873 13:37:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.483 13:37:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:02.790 [2024-07-12 13:37:51.114653] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:02.790 [2024-07-12 13:37:51.114686] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb36330 name Existed_Raid, state configuring 00:12:02.790 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:02.790 [2024-07-12 13:37:51.363317] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:02.790 [2024-07-12 13:37:51.363345] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:02.790 [2024-07-12 13:37:51.363355] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:02.790 [2024-07-12 13:37:51.363366] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:03.053 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:03.053 [2024-07-12 13:37:51.613891] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:03.053 BaseBdev1 00:12:03.331 13:37:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:03.331 13:37:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:03.331 13:37:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:03.331 13:37:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:03.331 13:37:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:03.331 13:37:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:03.331 13:37:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:03.331 13:37:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:03.600 [ 00:12:03.600 
{ 00:12:03.600 "name": "BaseBdev1", 00:12:03.600 "aliases": [ 00:12:03.600 "bca564dc-b736-442c-aba1-fb300dad3f2c" 00:12:03.600 ], 00:12:03.600 "product_name": "Malloc disk", 00:12:03.600 "block_size": 512, 00:12:03.600 "num_blocks": 65536, 00:12:03.600 "uuid": "bca564dc-b736-442c-aba1-fb300dad3f2c", 00:12:03.600 "assigned_rate_limits": { 00:12:03.600 "rw_ios_per_sec": 0, 00:12:03.600 "rw_mbytes_per_sec": 0, 00:12:03.600 "r_mbytes_per_sec": 0, 00:12:03.600 "w_mbytes_per_sec": 0 00:12:03.600 }, 00:12:03.600 "claimed": true, 00:12:03.600 "claim_type": "exclusive_write", 00:12:03.600 "zoned": false, 00:12:03.600 "supported_io_types": { 00:12:03.600 "read": true, 00:12:03.600 "write": true, 00:12:03.600 "unmap": true, 00:12:03.600 "flush": true, 00:12:03.600 "reset": true, 00:12:03.600 "nvme_admin": false, 00:12:03.600 "nvme_io": false, 00:12:03.600 "nvme_io_md": false, 00:12:03.600 "write_zeroes": true, 00:12:03.600 "zcopy": true, 00:12:03.600 "get_zone_info": false, 00:12:03.600 "zone_management": false, 00:12:03.600 "zone_append": false, 00:12:03.600 "compare": false, 00:12:03.600 "compare_and_write": false, 00:12:03.600 "abort": true, 00:12:03.600 "seek_hole": false, 00:12:03.600 "seek_data": false, 00:12:03.600 "copy": true, 00:12:03.600 "nvme_iov_md": false 00:12:03.600 }, 00:12:03.600 "memory_domains": [ 00:12:03.600 { 00:12:03.600 "dma_device_id": "system", 00:12:03.600 "dma_device_type": 1 00:12:03.600 }, 00:12:03.600 { 00:12:03.600 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.600 "dma_device_type": 2 00:12:03.600 } 00:12:03.600 ], 00:12:03.600 "driver_specific": {} 00:12:03.600 } 00:12:03.600 ] 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:03.601 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:03.879 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:03.879 "name": "Existed_Raid", 00:12:03.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:03.879 "strip_size_kb": 0, 00:12:03.879 "state": "configuring", 00:12:03.879 "raid_level": "raid1", 00:12:03.879 
"superblock": false, 00:12:03.879 "num_base_bdevs": 2, 00:12:03.879 "num_base_bdevs_discovered": 1, 00:12:03.879 "num_base_bdevs_operational": 2, 00:12:03.879 "base_bdevs_list": [ 00:12:03.879 { 00:12:03.879 "name": "BaseBdev1", 00:12:03.879 "uuid": "bca564dc-b736-442c-aba1-fb300dad3f2c", 00:12:03.879 "is_configured": true, 00:12:03.879 "data_offset": 0, 00:12:03.879 "data_size": 65536 00:12:03.879 }, 00:12:03.879 { 00:12:03.879 "name": "BaseBdev2", 00:12:03.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:03.879 "is_configured": false, 00:12:03.879 "data_offset": 0, 00:12:03.879 "data_size": 0 00:12:03.879 } 00:12:03.879 ] 00:12:03.879 }' 00:12:03.879 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:03.879 13:37:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:04.532 13:37:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:04.827 [2024-07-12 13:37:53.145951] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:04.827 [2024-07-12 13:37:53.145993] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb35c20 name Existed_Raid, state configuring 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:04.827 [2024-07-12 13:37:53.322435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:04.827 [2024-07-12 13:37:53.323899] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:04.827 [2024-07-12 13:37:53.323939] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:04.827 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.827 
13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:05.143 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:05.143 "name": "Existed_Raid", 00:12:05.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.143 "strip_size_kb": 0, 00:12:05.143 "state": "configuring", 00:12:05.143 "raid_level": "raid1", 00:12:05.143 "superblock": false, 00:12:05.143 "num_base_bdevs": 2, 00:12:05.143 "num_base_bdevs_discovered": 1, 00:12:05.143 "num_base_bdevs_operational": 2, 00:12:05.143 "base_bdevs_list": [ 00:12:05.143 { 00:12:05.143 "name": "BaseBdev1", 00:12:05.143 "uuid": "bca564dc-b736-442c-aba1-fb300dad3f2c", 00:12:05.143 "is_configured": true, 00:12:05.143 "data_offset": 0, 00:12:05.143 "data_size": 65536 00:12:05.143 }, 00:12:05.143 { 00:12:05.143 "name": "BaseBdev2", 00:12:05.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.143 "is_configured": false, 00:12:05.143 "data_offset": 0, 00:12:05.143 "data_size": 0 00:12:05.143 } 00:12:05.143 ] 00:12:05.143 }' 00:12:05.143 13:37:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:05.143 13:37:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.712 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:05.969 [2024-07-12 13:37:54.372544] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:05.970 [2024-07-12 13:37:54.372582] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb36a10 00:12:05.970 [2024-07-12 13:37:54.372590] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:12:05.970 [2024-07-12 13:37:54.372782] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcda4d0 00:12:05.970 [2024-07-12 13:37:54.372905] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb36a10 00:12:05.970 [2024-07-12 13:37:54.372916] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb36a10 00:12:05.970 [2024-07-12 13:37:54.373100] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:05.970 BaseBdev2 00:12:05.970 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:05.970 13:37:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:05.970 13:37:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:05.970 13:37:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:05.970 13:37:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:05.970 13:37:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:05.970 13:37:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:06.228 [ 00:12:06.228 { 
00:12:06.228 "name": "BaseBdev2", 00:12:06.228 "aliases": [ 00:12:06.228 "52af24f7-c1c4-44ce-86e0-fc2a80d9043c" 00:12:06.228 ], 00:12:06.228 "product_name": "Malloc disk", 00:12:06.228 "block_size": 512, 00:12:06.228 "num_blocks": 65536, 00:12:06.228 "uuid": "52af24f7-c1c4-44ce-86e0-fc2a80d9043c", 00:12:06.228 "assigned_rate_limits": { 00:12:06.228 "rw_ios_per_sec": 0, 00:12:06.228 "rw_mbytes_per_sec": 0, 00:12:06.228 "r_mbytes_per_sec": 0, 00:12:06.228 "w_mbytes_per_sec": 0 00:12:06.228 }, 00:12:06.228 "claimed": true, 00:12:06.228 "claim_type": "exclusive_write", 00:12:06.228 "zoned": false, 00:12:06.228 "supported_io_types": { 00:12:06.228 "read": true, 00:12:06.228 "write": true, 00:12:06.228 "unmap": true, 00:12:06.228 "flush": true, 00:12:06.228 "reset": true, 00:12:06.228 "nvme_admin": false, 00:12:06.228 "nvme_io": false, 00:12:06.228 "nvme_io_md": false, 00:12:06.228 "write_zeroes": true, 00:12:06.228 "zcopy": true, 00:12:06.228 "get_zone_info": false, 00:12:06.228 "zone_management": false, 00:12:06.228 "zone_append": false, 00:12:06.228 "compare": false, 00:12:06.228 "compare_and_write": false, 00:12:06.228 "abort": true, 00:12:06.228 "seek_hole": false, 00:12:06.228 "seek_data": false, 00:12:06.228 "copy": true, 00:12:06.228 "nvme_iov_md": false 00:12:06.228 }, 00:12:06.228 "memory_domains": [ 00:12:06.228 { 00:12:06.228 "dma_device_id": "system", 00:12:06.228 "dma_device_type": 1 00:12:06.228 }, 00:12:06.228 { 00:12:06.228 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:06.228 "dma_device_type": 2 00:12:06.228 } 00:12:06.228 ], 00:12:06.228 "driver_specific": {} 00:12:06.228 } 00:12:06.228 ] 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.228 13:37:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:06.486 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:06.486 "name": 
"Existed_Raid", 00:12:06.486 "uuid": "7de3a701-5e3f-4dea-bf01-737075056837", 00:12:06.486 "strip_size_kb": 0, 00:12:06.486 "state": "online", 00:12:06.486 "raid_level": "raid1", 00:12:06.486 "superblock": false, 00:12:06.486 "num_base_bdevs": 2, 00:12:06.486 "num_base_bdevs_discovered": 2, 00:12:06.486 "num_base_bdevs_operational": 2, 00:12:06.486 "base_bdevs_list": [ 00:12:06.486 { 00:12:06.486 "name": "BaseBdev1", 00:12:06.486 "uuid": "bca564dc-b736-442c-aba1-fb300dad3f2c", 00:12:06.486 "is_configured": true, 00:12:06.486 "data_offset": 0, 00:12:06.486 "data_size": 65536 00:12:06.486 }, 00:12:06.486 { 00:12:06.486 "name": "BaseBdev2", 00:12:06.486 "uuid": "52af24f7-c1c4-44ce-86e0-fc2a80d9043c", 00:12:06.486 "is_configured": true, 00:12:06.486 "data_offset": 0, 00:12:06.486 "data_size": 65536 00:12:06.486 } 00:12:06.486 ] 00:12:06.486 }' 00:12:06.486 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:06.486 13:37:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.053 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:07.053 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:07.053 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:07.053 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:07.053 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:07.053 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:07.053 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:07.053 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:07.312 [2024-07-12 13:37:55.852757] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:07.312 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:07.312 "name": "Existed_Raid", 00:12:07.312 "aliases": [ 00:12:07.312 "7de3a701-5e3f-4dea-bf01-737075056837" 00:12:07.312 ], 00:12:07.312 "product_name": "Raid Volume", 00:12:07.312 "block_size": 512, 00:12:07.312 "num_blocks": 65536, 00:12:07.312 "uuid": "7de3a701-5e3f-4dea-bf01-737075056837", 00:12:07.312 "assigned_rate_limits": { 00:12:07.312 "rw_ios_per_sec": 0, 00:12:07.312 "rw_mbytes_per_sec": 0, 00:12:07.312 "r_mbytes_per_sec": 0, 00:12:07.312 "w_mbytes_per_sec": 0 00:12:07.312 }, 00:12:07.312 "claimed": false, 00:12:07.312 "zoned": false, 00:12:07.312 "supported_io_types": { 00:12:07.312 "read": true, 00:12:07.312 "write": true, 00:12:07.312 "unmap": false, 00:12:07.312 "flush": false, 00:12:07.312 "reset": true, 00:12:07.312 "nvme_admin": false, 00:12:07.312 "nvme_io": false, 00:12:07.312 "nvme_io_md": false, 00:12:07.312 "write_zeroes": true, 00:12:07.312 "zcopy": false, 00:12:07.312 "get_zone_info": false, 00:12:07.312 "zone_management": false, 00:12:07.312 "zone_append": false, 00:12:07.312 "compare": false, 00:12:07.312 "compare_and_write": false, 00:12:07.312 "abort": false, 00:12:07.312 "seek_hole": false, 00:12:07.312 "seek_data": false, 00:12:07.312 "copy": false, 00:12:07.312 "nvme_iov_md": false 00:12:07.312 }, 00:12:07.312 "memory_domains": [ 
00:12:07.312 { 00:12:07.312 "dma_device_id": "system", 00:12:07.312 "dma_device_type": 1 00:12:07.312 }, 00:12:07.312 { 00:12:07.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:07.312 "dma_device_type": 2 00:12:07.312 }, 00:12:07.312 { 00:12:07.312 "dma_device_id": "system", 00:12:07.312 "dma_device_type": 1 00:12:07.312 }, 00:12:07.312 { 00:12:07.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:07.312 "dma_device_type": 2 00:12:07.312 } 00:12:07.312 ], 00:12:07.312 "driver_specific": { 00:12:07.312 "raid": { 00:12:07.312 "uuid": "7de3a701-5e3f-4dea-bf01-737075056837", 00:12:07.312 "strip_size_kb": 0, 00:12:07.312 "state": "online", 00:12:07.312 "raid_level": "raid1", 00:12:07.312 "superblock": false, 00:12:07.312 "num_base_bdevs": 2, 00:12:07.312 "num_base_bdevs_discovered": 2, 00:12:07.312 "num_base_bdevs_operational": 2, 00:12:07.312 "base_bdevs_list": [ 00:12:07.312 { 00:12:07.312 "name": "BaseBdev1", 00:12:07.312 "uuid": "bca564dc-b736-442c-aba1-fb300dad3f2c", 00:12:07.312 "is_configured": true, 00:12:07.312 "data_offset": 0, 00:12:07.312 "data_size": 65536 00:12:07.312 }, 00:12:07.312 { 00:12:07.312 "name": "BaseBdev2", 00:12:07.312 "uuid": "52af24f7-c1c4-44ce-86e0-fc2a80d9043c", 00:12:07.312 "is_configured": true, 00:12:07.312 "data_offset": 0, 00:12:07.312 "data_size": 65536 00:12:07.312 } 00:12:07.312 ] 00:12:07.312 } 00:12:07.312 } 00:12:07.312 }' 00:12:07.312 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:07.571 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:07.571 BaseBdev2' 00:12:07.571 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:07.571 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:07.571 13:37:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:07.829 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:07.830 "name": "BaseBdev1", 00:12:07.830 "aliases": [ 00:12:07.830 "bca564dc-b736-442c-aba1-fb300dad3f2c" 00:12:07.830 ], 00:12:07.830 "product_name": "Malloc disk", 00:12:07.830 "block_size": 512, 00:12:07.830 "num_blocks": 65536, 00:12:07.830 "uuid": "bca564dc-b736-442c-aba1-fb300dad3f2c", 00:12:07.830 "assigned_rate_limits": { 00:12:07.830 "rw_ios_per_sec": 0, 00:12:07.830 "rw_mbytes_per_sec": 0, 00:12:07.830 "r_mbytes_per_sec": 0, 00:12:07.830 "w_mbytes_per_sec": 0 00:12:07.830 }, 00:12:07.830 "claimed": true, 00:12:07.830 "claim_type": "exclusive_write", 00:12:07.830 "zoned": false, 00:12:07.830 "supported_io_types": { 00:12:07.830 "read": true, 00:12:07.830 "write": true, 00:12:07.830 "unmap": true, 00:12:07.830 "flush": true, 00:12:07.830 "reset": true, 00:12:07.830 "nvme_admin": false, 00:12:07.830 "nvme_io": false, 00:12:07.830 "nvme_io_md": false, 00:12:07.830 "write_zeroes": true, 00:12:07.830 "zcopy": true, 00:12:07.830 "get_zone_info": false, 00:12:07.830 "zone_management": false, 00:12:07.830 "zone_append": false, 00:12:07.830 "compare": false, 00:12:07.830 "compare_and_write": false, 00:12:07.830 "abort": true, 00:12:07.830 "seek_hole": false, 00:12:07.830 "seek_data": false, 00:12:07.830 "copy": true, 00:12:07.830 "nvme_iov_md": false 00:12:07.830 }, 00:12:07.830 "memory_domains": [ 
00:12:07.830 { 00:12:07.830 "dma_device_id": "system", 00:12:07.830 "dma_device_type": 1 00:12:07.830 }, 00:12:07.830 { 00:12:07.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:07.830 "dma_device_type": 2 00:12:07.830 } 00:12:07.830 ], 00:12:07.830 "driver_specific": {} 00:12:07.830 }' 00:12:07.830 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:07.830 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:07.830 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:07.830 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:07.830 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:07.830 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:07.830 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:07.830 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.088 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:08.088 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.088 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.088 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:08.088 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:08.088 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:08.088 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:08.347 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:08.347 "name": "BaseBdev2", 00:12:08.347 "aliases": [ 00:12:08.347 "52af24f7-c1c4-44ce-86e0-fc2a80d9043c" 00:12:08.347 ], 00:12:08.347 "product_name": "Malloc disk", 00:12:08.347 "block_size": 512, 00:12:08.347 "num_blocks": 65536, 00:12:08.347 "uuid": "52af24f7-c1c4-44ce-86e0-fc2a80d9043c", 00:12:08.347 "assigned_rate_limits": { 00:12:08.347 "rw_ios_per_sec": 0, 00:12:08.347 "rw_mbytes_per_sec": 0, 00:12:08.347 "r_mbytes_per_sec": 0, 00:12:08.347 "w_mbytes_per_sec": 0 00:12:08.347 }, 00:12:08.347 "claimed": true, 00:12:08.347 "claim_type": "exclusive_write", 00:12:08.347 "zoned": false, 00:12:08.347 "supported_io_types": { 00:12:08.347 "read": true, 00:12:08.347 "write": true, 00:12:08.347 "unmap": true, 00:12:08.347 "flush": true, 00:12:08.347 "reset": true, 00:12:08.347 "nvme_admin": false, 00:12:08.347 "nvme_io": false, 00:12:08.347 "nvme_io_md": false, 00:12:08.347 "write_zeroes": true, 00:12:08.347 "zcopy": true, 00:12:08.347 "get_zone_info": false, 00:12:08.347 "zone_management": false, 00:12:08.347 "zone_append": false, 00:12:08.347 "compare": false, 00:12:08.347 "compare_and_write": false, 00:12:08.347 "abort": true, 00:12:08.347 "seek_hole": false, 00:12:08.347 "seek_data": false, 00:12:08.347 "copy": true, 00:12:08.347 "nvme_iov_md": false 00:12:08.347 }, 00:12:08.347 "memory_domains": [ 00:12:08.347 { 00:12:08.347 "dma_device_id": "system", 00:12:08.347 "dma_device_type": 1 00:12:08.347 }, 00:12:08.347 { 00:12:08.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:12:08.347 "dma_device_type": 2 00:12:08.347 } 00:12:08.347 ], 00:12:08.347 "driver_specific": {} 00:12:08.347 }' 00:12:08.347 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.347 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.347 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:08.347 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.347 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.606 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:08.606 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.606 13:37:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.606 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:08.606 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.606 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.606 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:08.606 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:08.865 [2024-07-12 13:37:57.364561] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:08.865 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:08.866 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.866 13:37:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:09.124 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.124 "name": "Existed_Raid", 00:12:09.124 "uuid": "7de3a701-5e3f-4dea-bf01-737075056837", 00:12:09.124 "strip_size_kb": 0, 00:12:09.124 "state": "online", 00:12:09.124 "raid_level": "raid1", 00:12:09.124 "superblock": false, 00:12:09.124 "num_base_bdevs": 2, 00:12:09.124 "num_base_bdevs_discovered": 1, 00:12:09.124 "num_base_bdevs_operational": 1, 00:12:09.124 "base_bdevs_list": [ 00:12:09.124 { 00:12:09.124 "name": null, 00:12:09.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:09.124 "is_configured": false, 00:12:09.124 "data_offset": 0, 00:12:09.124 "data_size": 65536 00:12:09.124 }, 00:12:09.124 { 00:12:09.124 "name": "BaseBdev2", 00:12:09.124 "uuid": "52af24f7-c1c4-44ce-86e0-fc2a80d9043c", 00:12:09.124 "is_configured": true, 00:12:09.124 "data_offset": 0, 00:12:09.124 "data_size": 65536 00:12:09.124 } 00:12:09.124 ] 00:12:09.124 }' 00:12:09.125 13:37:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.125 13:37:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:09.692 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:09.692 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:09.693 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.693 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:09.951 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:09.951 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:09.951 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:10.210 [2024-07-12 13:37:58.721117] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:10.210 [2024-07-12 13:37:58.721201] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:10.210 [2024-07-12 13:37:58.732157] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:10.210 [2024-07-12 13:37:58.732194] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:10.210 [2024-07-12 13:37:58.732206] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb36a10 name Existed_Raid, state offline 00:12:10.210 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:10.210 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:10.210 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.210 13:37:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:10.469 13:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:10.469 
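The steps above exercise raid1 redundancy: deleting BaseBdev1 leaves Existed_Raid online with a single discovered base bdev, and deleting BaseBdev2 afterwards deconfigures the array (the online-to-offline messages above). A condensed bash sketch of that check with the same RPCs; the trailing jq field selection is an illustrative addition:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py   # path as it appears in this log
    sock=/var/tmp/spdk-raid.sock
    # raid1 with two base bdevs tolerates losing one: state should stay "online".
    $rpc -s $sock bdev_malloc_delete BaseBdev1
    $rpc -s $sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid") | .state, .num_base_bdevs_discovered'
    # Removing the last base bdev deconfigures the array, as the offline messages above show.
    $rpc -s $sock bdev_malloc_delete BaseBdev2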
13:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:10.469 13:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:10.469 13:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 439610 00:12:10.469 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 439610 ']' 00:12:10.469 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 439610 00:12:10.469 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:10.469 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:10.469 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 439610 00:12:10.728 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:10.728 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:10.728 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 439610' 00:12:10.728 killing process with pid 439610 00:12:10.728 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 439610 00:12:10.728 [2024-07-12 13:37:59.065935] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:10.728 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 439610 00:12:10.728 [2024-07-12 13:37:59.066850] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:10.728 13:37:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:10.728 00:12:10.728 real 0m10.477s 00:12:10.728 user 0m18.642s 00:12:10.728 sys 0m1.950s 00:12:10.728 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:10.728 13:37:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.728 ************************************ 00:12:10.728 END TEST raid_state_function_test 00:12:10.728 ************************************ 00:12:10.987 13:37:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:10.987 13:37:59 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:12:10.987 13:37:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:10.987 13:37:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:10.987 13:37:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:10.987 ************************************ 00:12:10.987 START TEST raid_state_function_test_sb 00:12:10.987 ************************************ 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 
)) 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=441171 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 441171' 00:12:10.987 Process raid pid: 441171 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 441171 /var/tmp/spdk-raid.sock 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 441171 ']' 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:10.987 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:10.987 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:10.988 13:37:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:10.988 [2024-07-12 13:37:59.417279] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
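raid_state_function_test_sb starts here the same way the previous test did: it launches the bare bdev_svc application with bdev_raid debug logging on a private RPC socket and waits for the socket before issuing RPCs. A minimal sketch of that startup, assuming the workspace path shown in this log; the spdk/raid_pid variable names and the polling loop are simplified stand-ins for the waitforlisten helper traced above:

    spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk   # workspace path as it appears in this log
    sock=/var/tmp/spdk-raid.sock
    # Start the RPC-only bdev service with raid debug logging enabled.
    $spdk/test/app/bdev_svc/bdev_svc -r $sock -i 0 -L bdev_raid &
    raid_pid=$!
    # Simplified stand-in for: waitforlisten $raid_pid /var/tmp/spdk-raid.sock
    until [ -S $sock ]; do sleep 0.1; done
    $spdk/scripts/rpc.py -s $sock bdev_raid_get_bdevs all   # socket is ready; raid RPCs can be issued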
00:12:10.988 [2024-07-12 13:37:59.417343] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:10.988 [2024-07-12 13:37:59.548652] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:11.247 [2024-07-12 13:37:59.655724] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:11.247 [2024-07-12 13:37:59.720696] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:11.247 [2024-07-12 13:37:59.720733] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:11.815 13:38:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:11.815 13:38:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:11.815 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:12.384 [2024-07-12 13:38:00.833412] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:12.384 [2024-07-12 13:38:00.833457] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:12.384 [2024-07-12 13:38:00.833467] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:12.384 [2024-07-12 13:38:00.833479] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.384 13:38:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:12.951 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:12.951 "name": "Existed_Raid", 00:12:12.952 "uuid": "13db3098-962c-4192-9947-9dece3170f95", 00:12:12.952 "strip_size_kb": 0, 00:12:12.952 "state": "configuring", 00:12:12.952 "raid_level": "raid1", 
00:12:12.952 "superblock": true, 00:12:12.952 "num_base_bdevs": 2, 00:12:12.952 "num_base_bdevs_discovered": 0, 00:12:12.952 "num_base_bdevs_operational": 2, 00:12:12.952 "base_bdevs_list": [ 00:12:12.952 { 00:12:12.952 "name": "BaseBdev1", 00:12:12.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:12.952 "is_configured": false, 00:12:12.952 "data_offset": 0, 00:12:12.952 "data_size": 0 00:12:12.952 }, 00:12:12.952 { 00:12:12.952 "name": "BaseBdev2", 00:12:12.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:12.952 "is_configured": false, 00:12:12.952 "data_offset": 0, 00:12:12.952 "data_size": 0 00:12:12.952 } 00:12:12.952 ] 00:12:12.952 }' 00:12:12.952 13:38:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:12.952 13:38:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:13.889 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:13.889 [2024-07-12 13:38:02.437513] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:13.889 [2024-07-12 13:38:02.437543] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a29330 name Existed_Raid, state configuring 00:12:13.889 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:14.148 [2024-07-12 13:38:02.686178] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:14.148 [2024-07-12 13:38:02.686207] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:14.148 [2024-07-12 13:38:02.686217] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:14.148 [2024-07-12 13:38:02.686228] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:14.148 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:14.408 [2024-07-12 13:38:02.944794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:14.408 BaseBdev1 00:12:14.408 13:38:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:14.408 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:14.408 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:14.408 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:14.408 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:14.408 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:14.408 13:38:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:14.666 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:14.924 [ 00:12:14.924 { 00:12:14.924 "name": "BaseBdev1", 00:12:14.924 "aliases": [ 00:12:14.924 "1eacc60d-1063-4f91-9a31-d400239198a8" 00:12:14.924 ], 00:12:14.924 "product_name": "Malloc disk", 00:12:14.924 "block_size": 512, 00:12:14.924 "num_blocks": 65536, 00:12:14.924 "uuid": "1eacc60d-1063-4f91-9a31-d400239198a8", 00:12:14.924 "assigned_rate_limits": { 00:12:14.924 "rw_ios_per_sec": 0, 00:12:14.924 "rw_mbytes_per_sec": 0, 00:12:14.924 "r_mbytes_per_sec": 0, 00:12:14.924 "w_mbytes_per_sec": 0 00:12:14.924 }, 00:12:14.924 "claimed": true, 00:12:14.924 "claim_type": "exclusive_write", 00:12:14.924 "zoned": false, 00:12:14.924 "supported_io_types": { 00:12:14.924 "read": true, 00:12:14.924 "write": true, 00:12:14.924 "unmap": true, 00:12:14.924 "flush": true, 00:12:14.924 "reset": true, 00:12:14.924 "nvme_admin": false, 00:12:14.924 "nvme_io": false, 00:12:14.924 "nvme_io_md": false, 00:12:14.924 "write_zeroes": true, 00:12:14.924 "zcopy": true, 00:12:14.924 "get_zone_info": false, 00:12:14.924 "zone_management": false, 00:12:14.924 "zone_append": false, 00:12:14.924 "compare": false, 00:12:14.924 "compare_and_write": false, 00:12:14.924 "abort": true, 00:12:14.924 "seek_hole": false, 00:12:14.924 "seek_data": false, 00:12:14.924 "copy": true, 00:12:14.924 "nvme_iov_md": false 00:12:14.924 }, 00:12:14.924 "memory_domains": [ 00:12:14.924 { 00:12:14.924 "dma_device_id": "system", 00:12:14.924 "dma_device_type": 1 00:12:14.924 }, 00:12:14.924 { 00:12:14.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:14.924 "dma_device_type": 2 00:12:14.924 } 00:12:14.924 ], 00:12:14.924 "driver_specific": {} 00:12:14.924 } 00:12:14.924 ] 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.924 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:15.183 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:15.183 "name": "Existed_Raid", 00:12:15.183 "uuid": 
"3e2baf4a-e6b2-4e4d-9a0b-7ead04b8ab9f", 00:12:15.183 "strip_size_kb": 0, 00:12:15.183 "state": "configuring", 00:12:15.183 "raid_level": "raid1", 00:12:15.183 "superblock": true, 00:12:15.183 "num_base_bdevs": 2, 00:12:15.183 "num_base_bdevs_discovered": 1, 00:12:15.183 "num_base_bdevs_operational": 2, 00:12:15.183 "base_bdevs_list": [ 00:12:15.183 { 00:12:15.183 "name": "BaseBdev1", 00:12:15.183 "uuid": "1eacc60d-1063-4f91-9a31-d400239198a8", 00:12:15.183 "is_configured": true, 00:12:15.183 "data_offset": 2048, 00:12:15.183 "data_size": 63488 00:12:15.183 }, 00:12:15.183 { 00:12:15.183 "name": "BaseBdev2", 00:12:15.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:15.183 "is_configured": false, 00:12:15.183 "data_offset": 0, 00:12:15.183 "data_size": 0 00:12:15.183 } 00:12:15.183 ] 00:12:15.183 }' 00:12:15.183 13:38:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:15.183 13:38:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:15.752 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:16.011 [2024-07-12 13:38:04.521154] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:16.011 [2024-07-12 13:38:04.521192] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a28c20 name Existed_Raid, state configuring 00:12:16.011 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:16.270 [2024-07-12 13:38:04.765833] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:16.270 [2024-07-12 13:38:04.767310] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:16.270 [2024-07-12 13:38:04.767342] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 
00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.270 13:38:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:16.529 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:16.529 "name": "Existed_Raid", 00:12:16.529 "uuid": "441823f8-33fd-4dc4-bb40-b0c7f24b028f", 00:12:16.529 "strip_size_kb": 0, 00:12:16.529 "state": "configuring", 00:12:16.529 "raid_level": "raid1", 00:12:16.529 "superblock": true, 00:12:16.529 "num_base_bdevs": 2, 00:12:16.529 "num_base_bdevs_discovered": 1, 00:12:16.529 "num_base_bdevs_operational": 2, 00:12:16.529 "base_bdevs_list": [ 00:12:16.529 { 00:12:16.529 "name": "BaseBdev1", 00:12:16.529 "uuid": "1eacc60d-1063-4f91-9a31-d400239198a8", 00:12:16.529 "is_configured": true, 00:12:16.529 "data_offset": 2048, 00:12:16.529 "data_size": 63488 00:12:16.529 }, 00:12:16.529 { 00:12:16.529 "name": "BaseBdev2", 00:12:16.529 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.529 "is_configured": false, 00:12:16.529 "data_offset": 0, 00:12:16.529 "data_size": 0 00:12:16.529 } 00:12:16.529 ] 00:12:16.529 }' 00:12:16.529 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:16.529 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:17.095 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:17.354 [2024-07-12 13:38:05.856031] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:17.354 [2024-07-12 13:38:05.856174] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a29a10 00:12:17.354 [2024-07-12 13:38:05.856188] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:17.354 [2024-07-12 13:38:05.856364] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a28b70 00:12:17.354 [2024-07-12 13:38:05.856486] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a29a10 00:12:17.354 [2024-07-12 13:38:05.856497] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a29a10 00:12:17.354 [2024-07-12 13:38:05.856586] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:17.354 BaseBdev2 00:12:17.354 13:38:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:17.355 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:17.355 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:17.355 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:17.355 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:17.355 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:17.355 13:38:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 
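At this point the trace has shown the full construction sequence for the superblock variant: the raid1 bdev is created first with -s and sits in "configuring" while its members are missing, then BaseBdev1 and BaseBdev2 are created as 32 MB malloc disks with 512-byte blocks and the array transitions to "online". A condensed bash sketch of that sequence with the same RPCs; the final jq field selection is an illustrative addition:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py   # path as it appears in this log
    sock=/var/tmp/spdk-raid.sock
    # Create the raid1 bdev with superblock (-s); it stays in "configuring" until both members exist.
    $rpc -s $sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
    # 32 MB malloc disks, 512-byte blocks (num_blocks 65536 in the dumps above).
    $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1
    $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev2
    $rpc -s $sock bdev_wait_for_examine
    $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # expect "online"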
00:12:17.614 13:38:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:17.873 [ 00:12:17.873 { 00:12:17.873 "name": "BaseBdev2", 00:12:17.873 "aliases": [ 00:12:17.873 "43718ad3-7d0c-40f1-b2cc-108be37f6f64" 00:12:17.873 ], 00:12:17.873 "product_name": "Malloc disk", 00:12:17.873 "block_size": 512, 00:12:17.873 "num_blocks": 65536, 00:12:17.873 "uuid": "43718ad3-7d0c-40f1-b2cc-108be37f6f64", 00:12:17.873 "assigned_rate_limits": { 00:12:17.873 "rw_ios_per_sec": 0, 00:12:17.873 "rw_mbytes_per_sec": 0, 00:12:17.873 "r_mbytes_per_sec": 0, 00:12:17.873 "w_mbytes_per_sec": 0 00:12:17.873 }, 00:12:17.873 "claimed": true, 00:12:17.873 "claim_type": "exclusive_write", 00:12:17.873 "zoned": false, 00:12:17.873 "supported_io_types": { 00:12:17.873 "read": true, 00:12:17.873 "write": true, 00:12:17.873 "unmap": true, 00:12:17.873 "flush": true, 00:12:17.873 "reset": true, 00:12:17.873 "nvme_admin": false, 00:12:17.873 "nvme_io": false, 00:12:17.873 "nvme_io_md": false, 00:12:17.873 "write_zeroes": true, 00:12:17.873 "zcopy": true, 00:12:17.873 "get_zone_info": false, 00:12:17.873 "zone_management": false, 00:12:17.873 "zone_append": false, 00:12:17.873 "compare": false, 00:12:17.873 "compare_and_write": false, 00:12:17.873 "abort": true, 00:12:17.873 "seek_hole": false, 00:12:17.873 "seek_data": false, 00:12:17.873 "copy": true, 00:12:17.873 "nvme_iov_md": false 00:12:17.873 }, 00:12:17.873 "memory_domains": [ 00:12:17.873 { 00:12:17.873 "dma_device_id": "system", 00:12:17.873 "dma_device_type": 1 00:12:17.873 }, 00:12:17.874 { 00:12:17.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:17.874 "dma_device_type": 2 00:12:17.874 } 00:12:17.874 ], 00:12:17.874 "driver_specific": {} 00:12:17.874 } 00:12:17.874 ] 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.874 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:18.133 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.133 "name": "Existed_Raid", 00:12:18.133 "uuid": "441823f8-33fd-4dc4-bb40-b0c7f24b028f", 00:12:18.133 "strip_size_kb": 0, 00:12:18.133 "state": "online", 00:12:18.133 "raid_level": "raid1", 00:12:18.133 "superblock": true, 00:12:18.133 "num_base_bdevs": 2, 00:12:18.133 "num_base_bdevs_discovered": 2, 00:12:18.133 "num_base_bdevs_operational": 2, 00:12:18.133 "base_bdevs_list": [ 00:12:18.133 { 00:12:18.133 "name": "BaseBdev1", 00:12:18.133 "uuid": "1eacc60d-1063-4f91-9a31-d400239198a8", 00:12:18.133 "is_configured": true, 00:12:18.133 "data_offset": 2048, 00:12:18.133 "data_size": 63488 00:12:18.133 }, 00:12:18.133 { 00:12:18.133 "name": "BaseBdev2", 00:12:18.133 "uuid": "43718ad3-7d0c-40f1-b2cc-108be37f6f64", 00:12:18.133 "is_configured": true, 00:12:18.133 "data_offset": 2048, 00:12:18.133 "data_size": 63488 00:12:18.133 } 00:12:18.133 ] 00:12:18.133 }' 00:12:18.133 13:38:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.133 13:38:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:18.702 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:18.702 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:18.702 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:18.702 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:18.702 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:18.702 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:18.702 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:18.702 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:18.961 [2024-07-12 13:38:07.452541] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:18.961 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:18.961 "name": "Existed_Raid", 00:12:18.961 "aliases": [ 00:12:18.961 "441823f8-33fd-4dc4-bb40-b0c7f24b028f" 00:12:18.961 ], 00:12:18.961 "product_name": "Raid Volume", 00:12:18.961 "block_size": 512, 00:12:18.961 "num_blocks": 63488, 00:12:18.961 "uuid": "441823f8-33fd-4dc4-bb40-b0c7f24b028f", 00:12:18.961 "assigned_rate_limits": { 00:12:18.961 "rw_ios_per_sec": 0, 00:12:18.961 "rw_mbytes_per_sec": 0, 00:12:18.961 "r_mbytes_per_sec": 0, 00:12:18.961 "w_mbytes_per_sec": 0 00:12:18.961 }, 00:12:18.961 "claimed": false, 00:12:18.961 "zoned": false, 00:12:18.961 "supported_io_types": { 00:12:18.961 "read": true, 00:12:18.961 "write": true, 00:12:18.961 "unmap": false, 00:12:18.961 "flush": false, 00:12:18.961 "reset": true, 00:12:18.961 "nvme_admin": false, 00:12:18.961 "nvme_io": false, 00:12:18.961 "nvme_io_md": false, 00:12:18.961 "write_zeroes": true, 00:12:18.961 "zcopy": false, 00:12:18.961 "get_zone_info": false, 
00:12:18.961 "zone_management": false, 00:12:18.961 "zone_append": false, 00:12:18.961 "compare": false, 00:12:18.961 "compare_and_write": false, 00:12:18.961 "abort": false, 00:12:18.961 "seek_hole": false, 00:12:18.961 "seek_data": false, 00:12:18.961 "copy": false, 00:12:18.961 "nvme_iov_md": false 00:12:18.961 }, 00:12:18.962 "memory_domains": [ 00:12:18.962 { 00:12:18.962 "dma_device_id": "system", 00:12:18.962 "dma_device_type": 1 00:12:18.962 }, 00:12:18.962 { 00:12:18.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.962 "dma_device_type": 2 00:12:18.962 }, 00:12:18.962 { 00:12:18.962 "dma_device_id": "system", 00:12:18.962 "dma_device_type": 1 00:12:18.962 }, 00:12:18.962 { 00:12:18.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.962 "dma_device_type": 2 00:12:18.962 } 00:12:18.962 ], 00:12:18.962 "driver_specific": { 00:12:18.962 "raid": { 00:12:18.962 "uuid": "441823f8-33fd-4dc4-bb40-b0c7f24b028f", 00:12:18.962 "strip_size_kb": 0, 00:12:18.962 "state": "online", 00:12:18.962 "raid_level": "raid1", 00:12:18.962 "superblock": true, 00:12:18.962 "num_base_bdevs": 2, 00:12:18.962 "num_base_bdevs_discovered": 2, 00:12:18.962 "num_base_bdevs_operational": 2, 00:12:18.962 "base_bdevs_list": [ 00:12:18.962 { 00:12:18.962 "name": "BaseBdev1", 00:12:18.962 "uuid": "1eacc60d-1063-4f91-9a31-d400239198a8", 00:12:18.962 "is_configured": true, 00:12:18.962 "data_offset": 2048, 00:12:18.962 "data_size": 63488 00:12:18.962 }, 00:12:18.962 { 00:12:18.962 "name": "BaseBdev2", 00:12:18.962 "uuid": "43718ad3-7d0c-40f1-b2cc-108be37f6f64", 00:12:18.962 "is_configured": true, 00:12:18.962 "data_offset": 2048, 00:12:18.962 "data_size": 63488 00:12:18.962 } 00:12:18.962 ] 00:12:18.962 } 00:12:18.962 } 00:12:18.962 }' 00:12:18.962 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:18.962 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:18.962 BaseBdev2' 00:12:18.962 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:18.962 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:18.962 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:19.221 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:19.221 "name": "BaseBdev1", 00:12:19.221 "aliases": [ 00:12:19.221 "1eacc60d-1063-4f91-9a31-d400239198a8" 00:12:19.221 ], 00:12:19.221 "product_name": "Malloc disk", 00:12:19.221 "block_size": 512, 00:12:19.221 "num_blocks": 65536, 00:12:19.221 "uuid": "1eacc60d-1063-4f91-9a31-d400239198a8", 00:12:19.221 "assigned_rate_limits": { 00:12:19.221 "rw_ios_per_sec": 0, 00:12:19.221 "rw_mbytes_per_sec": 0, 00:12:19.221 "r_mbytes_per_sec": 0, 00:12:19.221 "w_mbytes_per_sec": 0 00:12:19.221 }, 00:12:19.221 "claimed": true, 00:12:19.221 "claim_type": "exclusive_write", 00:12:19.221 "zoned": false, 00:12:19.221 "supported_io_types": { 00:12:19.221 "read": true, 00:12:19.221 "write": true, 00:12:19.221 "unmap": true, 00:12:19.221 "flush": true, 00:12:19.221 "reset": true, 00:12:19.221 "nvme_admin": false, 00:12:19.221 "nvme_io": false, 00:12:19.221 "nvme_io_md": false, 00:12:19.221 "write_zeroes": true, 00:12:19.221 "zcopy": true, 00:12:19.221 
"get_zone_info": false, 00:12:19.221 "zone_management": false, 00:12:19.221 "zone_append": false, 00:12:19.221 "compare": false, 00:12:19.221 "compare_and_write": false, 00:12:19.221 "abort": true, 00:12:19.221 "seek_hole": false, 00:12:19.221 "seek_data": false, 00:12:19.221 "copy": true, 00:12:19.221 "nvme_iov_md": false 00:12:19.221 }, 00:12:19.221 "memory_domains": [ 00:12:19.221 { 00:12:19.221 "dma_device_id": "system", 00:12:19.221 "dma_device_type": 1 00:12:19.221 }, 00:12:19.221 { 00:12:19.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.222 "dma_device_type": 2 00:12:19.222 } 00:12:19.222 ], 00:12:19.222 "driver_specific": {} 00:12:19.222 }' 00:12:19.222 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.481 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.481 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:19.481 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.481 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.481 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:19.481 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.481 13:38:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.481 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:19.481 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.740 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.740 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:19.740 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:19.740 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:19.740 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:19.999 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:19.999 "name": "BaseBdev2", 00:12:19.999 "aliases": [ 00:12:19.999 "43718ad3-7d0c-40f1-b2cc-108be37f6f64" 00:12:19.999 ], 00:12:19.999 "product_name": "Malloc disk", 00:12:19.999 "block_size": 512, 00:12:19.999 "num_blocks": 65536, 00:12:19.999 "uuid": "43718ad3-7d0c-40f1-b2cc-108be37f6f64", 00:12:19.999 "assigned_rate_limits": { 00:12:19.999 "rw_ios_per_sec": 0, 00:12:19.999 "rw_mbytes_per_sec": 0, 00:12:19.999 "r_mbytes_per_sec": 0, 00:12:19.999 "w_mbytes_per_sec": 0 00:12:19.999 }, 00:12:19.999 "claimed": true, 00:12:19.999 "claim_type": "exclusive_write", 00:12:19.999 "zoned": false, 00:12:19.999 "supported_io_types": { 00:12:19.999 "read": true, 00:12:19.999 "write": true, 00:12:19.999 "unmap": true, 00:12:19.999 "flush": true, 00:12:19.999 "reset": true, 00:12:19.999 "nvme_admin": false, 00:12:19.999 "nvme_io": false, 00:12:19.999 "nvme_io_md": false, 00:12:19.999 "write_zeroes": true, 00:12:19.999 "zcopy": true, 00:12:19.999 "get_zone_info": false, 00:12:19.999 "zone_management": false, 00:12:19.999 "zone_append": false, 00:12:19.999 "compare": false, 
00:12:19.999 "compare_and_write": false, 00:12:19.999 "abort": true, 00:12:19.999 "seek_hole": false, 00:12:19.999 "seek_data": false, 00:12:19.999 "copy": true, 00:12:19.999 "nvme_iov_md": false 00:12:19.999 }, 00:12:19.999 "memory_domains": [ 00:12:19.999 { 00:12:19.999 "dma_device_id": "system", 00:12:19.999 "dma_device_type": 1 00:12:19.999 }, 00:12:19.999 { 00:12:19.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.999 "dma_device_type": 2 00:12:19.999 } 00:12:19.999 ], 00:12:19.999 "driver_specific": {} 00:12:19.999 }' 00:12:19.999 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.999 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.999 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:19.999 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.999 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.999 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:19.999 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:20.259 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:20.259 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:20.259 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:20.259 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:20.259 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:20.259 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:20.518 [2024-07-12 13:38:08.960338] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:20.518 13:38:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.518 13:38:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:20.777 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:20.777 "name": "Existed_Raid", 00:12:20.777 "uuid": "441823f8-33fd-4dc4-bb40-b0c7f24b028f", 00:12:20.777 "strip_size_kb": 0, 00:12:20.777 "state": "online", 00:12:20.777 "raid_level": "raid1", 00:12:20.777 "superblock": true, 00:12:20.777 "num_base_bdevs": 2, 00:12:20.777 "num_base_bdevs_discovered": 1, 00:12:20.777 "num_base_bdevs_operational": 1, 00:12:20.777 "base_bdevs_list": [ 00:12:20.777 { 00:12:20.777 "name": null, 00:12:20.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.777 "is_configured": false, 00:12:20.777 "data_offset": 2048, 00:12:20.777 "data_size": 63488 00:12:20.777 }, 00:12:20.777 { 00:12:20.777 "name": "BaseBdev2", 00:12:20.777 "uuid": "43718ad3-7d0c-40f1-b2cc-108be37f6f64", 00:12:20.777 "is_configured": true, 00:12:20.777 "data_offset": 2048, 00:12:20.777 "data_size": 63488 00:12:20.777 } 00:12:20.777 ] 00:12:20.777 }' 00:12:20.777 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:20.777 13:38:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:21.345 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:21.345 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:21.345 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.345 13:38:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:21.604 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:21.604 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:21.604 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:21.863 [2024-07-12 13:38:10.297828] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:21.863 [2024-07-12 13:38:10.297911] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:21.863 [2024-07-12 13:38:10.308801] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:21.863 [2024-07-12 13:38:10.308835] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:21.863 [2024-07-12 13:38:10.308846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a29a10 name Existed_Raid, state offline 00:12:21.863 13:38:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:21.863 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:21.863 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.863 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 441171 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 441171 ']' 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 441171 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 441171 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 441171' 00:12:22.122 killing process with pid 441171 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 441171 00:12:22.122 [2024-07-12 13:38:10.630160] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:22.122 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 441171 00:12:22.122 [2024-07-12 13:38:10.631033] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:22.382 13:38:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:22.382 00:12:22.382 real 0m11.481s 00:12:22.382 user 0m20.511s 00:12:22.382 sys 0m2.107s 00:12:22.382 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:22.382 13:38:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:22.382 ************************************ 00:12:22.382 END TEST raid_state_function_test_sb 00:12:22.382 ************************************ 00:12:22.382 13:38:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:22.382 13:38:10 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:12:22.382 13:38:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:22.382 13:38:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:22.382 13:38:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:22.382 ************************************ 00:12:22.382 START TEST raid_superblock_test 00:12:22.382 ************************************ 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=442961 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 442961 /var/tmp/spdk-raid.sock 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 442961 ']' 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:22.382 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:22.382 13:38:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.641 [2024-07-12 13:38:10.993224] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:12:22.641 [2024-07-12 13:38:10.993293] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid442961 ] 00:12:22.641 [2024-07-12 13:38:11.123514] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:22.901 [2024-07-12 13:38:11.232430] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:22.901 [2024-07-12 13:38:11.312931] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:22.901 [2024-07-12 13:38:11.312969] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:23.470 13:38:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:23.470 13:38:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:23.470 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:23.470 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:23.470 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:23.470 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:23.470 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:23.470 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:23.470 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:23.470 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:23.470 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:23.728 malloc1 00:12:23.728 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:23.986 [2024-07-12 13:38:12.508540] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:23.986 [2024-07-12 13:38:12.508587] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:23.986 [2024-07-12 13:38:12.508607] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc0ce90 00:12:23.986 [2024-07-12 13:38:12.508619] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:23.986 [2024-07-12 13:38:12.510148] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:23.986 [2024-07-12 13:38:12.510175] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:23.986 pt1 00:12:23.986 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:23.986 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:23.986 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:23.986 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:23.986 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:23.986 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:23.986 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:23.986 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:23.986 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:24.245 malloc2 00:12:24.245 13:38:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:24.503 [2024-07-12 13:38:13.022596] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:24.503 [2024-07-12 13:38:13.022641] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:24.503 [2024-07-12 13:38:13.022659] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcaafb0 00:12:24.503 [2024-07-12 13:38:13.022671] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:24.503 [2024-07-12 13:38:13.024130] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:24.503 [2024-07-12 13:38:13.024157] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:24.503 pt2 00:12:24.503 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:24.503 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:24.503 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:25.070 [2024-07-12 13:38:13.531942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:25.070 [2024-07-12 13:38:13.533311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:25.070 [2024-07-12 13:38:13.533453] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcab6b0 00:12:25.070 [2024-07-12 13:38:13.533466] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:25.070 [2024-07-12 13:38:13.533665] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc0e220 00:12:25.070 [2024-07-12 13:38:13.533806] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcab6b0 00:12:25.070 [2024-07-12 13:38:13.533816] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcab6b0 00:12:25.070 [2024-07-12 13:38:13.533914] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:25.070 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:25.070 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:25.070 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:25.070 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:25.070 13:38:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:25.070 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:25.070 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.070 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.070 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.070 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:25.070 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.070 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:25.330 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:25.330 "name": "raid_bdev1", 00:12:25.330 "uuid": "7170559f-7a35-4d8c-a3d7-c22035647f94", 00:12:25.330 "strip_size_kb": 0, 00:12:25.330 "state": "online", 00:12:25.330 "raid_level": "raid1", 00:12:25.330 "superblock": true, 00:12:25.330 "num_base_bdevs": 2, 00:12:25.330 "num_base_bdevs_discovered": 2, 00:12:25.330 "num_base_bdevs_operational": 2, 00:12:25.330 "base_bdevs_list": [ 00:12:25.330 { 00:12:25.330 "name": "pt1", 00:12:25.330 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:25.330 "is_configured": true, 00:12:25.330 "data_offset": 2048, 00:12:25.330 "data_size": 63488 00:12:25.330 }, 00:12:25.330 { 00:12:25.330 "name": "pt2", 00:12:25.330 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:25.330 "is_configured": true, 00:12:25.330 "data_offset": 2048, 00:12:25.330 "data_size": 63488 00:12:25.330 } 00:12:25.330 ] 00:12:25.330 }' 00:12:25.330 13:38:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:25.330 13:38:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.265 13:38:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:26.265 13:38:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:26.265 13:38:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:26.265 13:38:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:26.265 13:38:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:26.266 13:38:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:26.266 13:38:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:26.266 13:38:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:26.525 [2024-07-12 13:38:14.980015] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:26.525 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:26.525 "name": "raid_bdev1", 00:12:26.525 "aliases": [ 00:12:26.525 "7170559f-7a35-4d8c-a3d7-c22035647f94" 00:12:26.525 ], 00:12:26.525 "product_name": "Raid Volume", 00:12:26.525 "block_size": 512, 00:12:26.525 "num_blocks": 63488, 00:12:26.525 "uuid": 
"7170559f-7a35-4d8c-a3d7-c22035647f94", 00:12:26.525 "assigned_rate_limits": { 00:12:26.525 "rw_ios_per_sec": 0, 00:12:26.525 "rw_mbytes_per_sec": 0, 00:12:26.525 "r_mbytes_per_sec": 0, 00:12:26.525 "w_mbytes_per_sec": 0 00:12:26.525 }, 00:12:26.525 "claimed": false, 00:12:26.525 "zoned": false, 00:12:26.525 "supported_io_types": { 00:12:26.525 "read": true, 00:12:26.525 "write": true, 00:12:26.525 "unmap": false, 00:12:26.525 "flush": false, 00:12:26.525 "reset": true, 00:12:26.525 "nvme_admin": false, 00:12:26.525 "nvme_io": false, 00:12:26.525 "nvme_io_md": false, 00:12:26.525 "write_zeroes": true, 00:12:26.525 "zcopy": false, 00:12:26.525 "get_zone_info": false, 00:12:26.525 "zone_management": false, 00:12:26.525 "zone_append": false, 00:12:26.525 "compare": false, 00:12:26.525 "compare_and_write": false, 00:12:26.525 "abort": false, 00:12:26.525 "seek_hole": false, 00:12:26.525 "seek_data": false, 00:12:26.525 "copy": false, 00:12:26.525 "nvme_iov_md": false 00:12:26.525 }, 00:12:26.525 "memory_domains": [ 00:12:26.525 { 00:12:26.525 "dma_device_id": "system", 00:12:26.525 "dma_device_type": 1 00:12:26.525 }, 00:12:26.525 { 00:12:26.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:26.525 "dma_device_type": 2 00:12:26.525 }, 00:12:26.525 { 00:12:26.525 "dma_device_id": "system", 00:12:26.525 "dma_device_type": 1 00:12:26.525 }, 00:12:26.525 { 00:12:26.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:26.525 "dma_device_type": 2 00:12:26.525 } 00:12:26.525 ], 00:12:26.525 "driver_specific": { 00:12:26.525 "raid": { 00:12:26.525 "uuid": "7170559f-7a35-4d8c-a3d7-c22035647f94", 00:12:26.525 "strip_size_kb": 0, 00:12:26.525 "state": "online", 00:12:26.525 "raid_level": "raid1", 00:12:26.525 "superblock": true, 00:12:26.525 "num_base_bdevs": 2, 00:12:26.525 "num_base_bdevs_discovered": 2, 00:12:26.525 "num_base_bdevs_operational": 2, 00:12:26.525 "base_bdevs_list": [ 00:12:26.525 { 00:12:26.525 "name": "pt1", 00:12:26.525 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:26.525 "is_configured": true, 00:12:26.525 "data_offset": 2048, 00:12:26.525 "data_size": 63488 00:12:26.525 }, 00:12:26.525 { 00:12:26.525 "name": "pt2", 00:12:26.525 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:26.525 "is_configured": true, 00:12:26.525 "data_offset": 2048, 00:12:26.525 "data_size": 63488 00:12:26.525 } 00:12:26.525 ] 00:12:26.525 } 00:12:26.525 } 00:12:26.525 }' 00:12:26.525 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:26.525 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:26.525 pt2' 00:12:26.526 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:26.526 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:26.526 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:27.094 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:27.094 "name": "pt1", 00:12:27.094 "aliases": [ 00:12:27.094 "00000000-0000-0000-0000-000000000001" 00:12:27.094 ], 00:12:27.094 "product_name": "passthru", 00:12:27.094 "block_size": 512, 00:12:27.094 "num_blocks": 65536, 00:12:27.094 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:27.094 "assigned_rate_limits": { 00:12:27.094 
"rw_ios_per_sec": 0, 00:12:27.094 "rw_mbytes_per_sec": 0, 00:12:27.094 "r_mbytes_per_sec": 0, 00:12:27.094 "w_mbytes_per_sec": 0 00:12:27.094 }, 00:12:27.094 "claimed": true, 00:12:27.094 "claim_type": "exclusive_write", 00:12:27.094 "zoned": false, 00:12:27.094 "supported_io_types": { 00:12:27.094 "read": true, 00:12:27.094 "write": true, 00:12:27.094 "unmap": true, 00:12:27.094 "flush": true, 00:12:27.094 "reset": true, 00:12:27.094 "nvme_admin": false, 00:12:27.094 "nvme_io": false, 00:12:27.094 "nvme_io_md": false, 00:12:27.094 "write_zeroes": true, 00:12:27.094 "zcopy": true, 00:12:27.094 "get_zone_info": false, 00:12:27.094 "zone_management": false, 00:12:27.094 "zone_append": false, 00:12:27.094 "compare": false, 00:12:27.094 "compare_and_write": false, 00:12:27.094 "abort": true, 00:12:27.094 "seek_hole": false, 00:12:27.094 "seek_data": false, 00:12:27.094 "copy": true, 00:12:27.094 "nvme_iov_md": false 00:12:27.094 }, 00:12:27.094 "memory_domains": [ 00:12:27.094 { 00:12:27.094 "dma_device_id": "system", 00:12:27.094 "dma_device_type": 1 00:12:27.094 }, 00:12:27.094 { 00:12:27.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.094 "dma_device_type": 2 00:12:27.094 } 00:12:27.094 ], 00:12:27.094 "driver_specific": { 00:12:27.094 "passthru": { 00:12:27.094 "name": "pt1", 00:12:27.094 "base_bdev_name": "malloc1" 00:12:27.094 } 00:12:27.094 } 00:12:27.094 }' 00:12:27.094 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:27.094 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:27.353 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:27.353 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:27.353 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:27.353 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:27.353 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:27.611 13:38:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:27.611 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:27.611 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:27.611 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:27.611 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:27.611 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:27.611 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:27.611 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:28.183 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:28.183 "name": "pt2", 00:12:28.183 "aliases": [ 00:12:28.184 "00000000-0000-0000-0000-000000000002" 00:12:28.184 ], 00:12:28.184 "product_name": "passthru", 00:12:28.184 "block_size": 512, 00:12:28.184 "num_blocks": 65536, 00:12:28.184 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:28.184 "assigned_rate_limits": { 00:12:28.184 "rw_ios_per_sec": 0, 00:12:28.184 "rw_mbytes_per_sec": 0, 00:12:28.184 "r_mbytes_per_sec": 0, 00:12:28.184 "w_mbytes_per_sec": 0 
00:12:28.184 }, 00:12:28.184 "claimed": true, 00:12:28.184 "claim_type": "exclusive_write", 00:12:28.184 "zoned": false, 00:12:28.184 "supported_io_types": { 00:12:28.184 "read": true, 00:12:28.184 "write": true, 00:12:28.184 "unmap": true, 00:12:28.184 "flush": true, 00:12:28.184 "reset": true, 00:12:28.184 "nvme_admin": false, 00:12:28.184 "nvme_io": false, 00:12:28.184 "nvme_io_md": false, 00:12:28.184 "write_zeroes": true, 00:12:28.184 "zcopy": true, 00:12:28.184 "get_zone_info": false, 00:12:28.184 "zone_management": false, 00:12:28.184 "zone_append": false, 00:12:28.184 "compare": false, 00:12:28.184 "compare_and_write": false, 00:12:28.184 "abort": true, 00:12:28.184 "seek_hole": false, 00:12:28.184 "seek_data": false, 00:12:28.185 "copy": true, 00:12:28.185 "nvme_iov_md": false 00:12:28.185 }, 00:12:28.185 "memory_domains": [ 00:12:28.185 { 00:12:28.185 "dma_device_id": "system", 00:12:28.185 "dma_device_type": 1 00:12:28.185 }, 00:12:28.185 { 00:12:28.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:28.185 "dma_device_type": 2 00:12:28.185 } 00:12:28.185 ], 00:12:28.185 "driver_specific": { 00:12:28.185 "passthru": { 00:12:28.185 "name": "pt2", 00:12:28.185 "base_bdev_name": "malloc2" 00:12:28.185 } 00:12:28.185 } 00:12:28.185 }' 00:12:28.185 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:28.185 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:28.447 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:28.447 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:28.447 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:28.447 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:28.447 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:28.447 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:28.447 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:28.447 13:38:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:28.706 13:38:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:28.706 13:38:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:28.706 13:38:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:28.706 13:38:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:28.965 [2024-07-12 13:38:17.314223] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:28.965 13:38:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7170559f-7a35-4d8c-a3d7-c22035647f94 00:12:28.965 13:38:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 7170559f-7a35-4d8c-a3d7-c22035647f94 ']' 00:12:28.965 13:38:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:29.533 [2024-07-12 13:38:17.815295] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:29.533 [2024-07-12 13:38:17.815320] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:12:29.533 [2024-07-12 13:38:17.815376] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:29.533 [2024-07-12 13:38:17.815429] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:29.533 [2024-07-12 13:38:17.815441] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcab6b0 name raid_bdev1, state offline 00:12:29.533 13:38:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.533 13:38:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:29.533 13:38:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:29.533 13:38:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:29.533 13:38:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:29.533 13:38:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:29.792 13:38:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:29.792 13:38:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:30.051 13:38:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:30.051 13:38:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:30.620 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:30.879 [2024-07-12 13:38:19.327210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:30.879 [2024-07-12 13:38:19.328605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:30.879 [2024-07-12 13:38:19.328662] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:30.879 [2024-07-12 13:38:19.328701] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:30.879 [2024-07-12 13:38:19.328720] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:30.879 [2024-07-12 13:38:19.328729] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc0f8d0 name raid_bdev1, state configuring 00:12:30.879 request: 00:12:30.879 { 00:12:30.880 "name": "raid_bdev1", 00:12:30.880 "raid_level": "raid1", 00:12:30.880 "base_bdevs": [ 00:12:30.880 "malloc1", 00:12:30.880 "malloc2" 00:12:30.880 ], 00:12:30.880 "superblock": false, 00:12:30.880 "method": "bdev_raid_create", 00:12:30.880 "req_id": 1 00:12:30.880 } 00:12:30.880 Got JSON-RPC error response 00:12:30.880 response: 00:12:30.880 { 00:12:30.880 "code": -17, 00:12:30.880 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:30.880 } 00:12:30.880 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:30.880 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:30.880 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:30.880 13:38:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:30.880 13:38:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.880 13:38:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:31.138 13:38:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:31.139 13:38:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:31.139 13:38:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:31.706 [2024-07-12 13:38:20.133275] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:31.706 [2024-07-12 13:38:20.133330] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:31.706 [2024-07-12 13:38:20.133349] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc10040 00:12:31.706 [2024-07-12 13:38:20.133361] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:31.706 [2024-07-12 13:38:20.134984] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:12:31.706 [2024-07-12 13:38:20.135011] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:31.706 [2024-07-12 13:38:20.135080] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:31.706 [2024-07-12 13:38:20.135104] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:31.706 pt1 00:12:31.706 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:12:31.706 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:31.706 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:31.706 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:31.707 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:31.707 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:31.707 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:31.707 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:31.707 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:31.707 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:31.707 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.707 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:31.965 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:31.965 "name": "raid_bdev1", 00:12:31.965 "uuid": "7170559f-7a35-4d8c-a3d7-c22035647f94", 00:12:31.965 "strip_size_kb": 0, 00:12:31.965 "state": "configuring", 00:12:31.965 "raid_level": "raid1", 00:12:31.965 "superblock": true, 00:12:31.965 "num_base_bdevs": 2, 00:12:31.965 "num_base_bdevs_discovered": 1, 00:12:31.965 "num_base_bdevs_operational": 2, 00:12:31.965 "base_bdevs_list": [ 00:12:31.965 { 00:12:31.965 "name": "pt1", 00:12:31.965 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:31.965 "is_configured": true, 00:12:31.965 "data_offset": 2048, 00:12:31.965 "data_size": 63488 00:12:31.965 }, 00:12:31.965 { 00:12:31.965 "name": null, 00:12:31.965 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:31.965 "is_configured": false, 00:12:31.965 "data_offset": 2048, 00:12:31.965 "data_size": 63488 00:12:31.965 } 00:12:31.965 ] 00:12:31.965 }' 00:12:31.965 13:38:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:31.965 13:38:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:32.902 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:32.902 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:32.902 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:32.902 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:12:33.161 [2024-07-12 13:38:21.508941] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:33.161 [2024-07-12 13:38:21.508992] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:33.161 [2024-07-12 13:38:21.509012] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc0ebf0 00:12:33.161 [2024-07-12 13:38:21.509024] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:33.161 [2024-07-12 13:38:21.509359] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:33.161 [2024-07-12 13:38:21.509376] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:33.161 [2024-07-12 13:38:21.509440] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:33.161 [2024-07-12 13:38:21.509459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:33.161 [2024-07-12 13:38:21.509555] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc0b970 00:12:33.161 [2024-07-12 13:38:21.509565] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:33.161 [2024-07-12 13:38:21.509734] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcb0690 00:12:33.161 [2024-07-12 13:38:21.509857] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc0b970 00:12:33.161 [2024-07-12 13:38:21.509867] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc0b970 00:12:33.161 [2024-07-12 13:38:21.509974] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:33.161 pt2 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:33.161 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.419 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.419 "name": "raid_bdev1", 00:12:33.419 "uuid": 
"7170559f-7a35-4d8c-a3d7-c22035647f94", 00:12:33.419 "strip_size_kb": 0, 00:12:33.419 "state": "online", 00:12:33.419 "raid_level": "raid1", 00:12:33.419 "superblock": true, 00:12:33.419 "num_base_bdevs": 2, 00:12:33.419 "num_base_bdevs_discovered": 2, 00:12:33.419 "num_base_bdevs_operational": 2, 00:12:33.419 "base_bdevs_list": [ 00:12:33.419 { 00:12:33.419 "name": "pt1", 00:12:33.419 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:33.419 "is_configured": true, 00:12:33.419 "data_offset": 2048, 00:12:33.419 "data_size": 63488 00:12:33.419 }, 00:12:33.419 { 00:12:33.419 "name": "pt2", 00:12:33.419 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:33.419 "is_configured": true, 00:12:33.419 "data_offset": 2048, 00:12:33.419 "data_size": 63488 00:12:33.419 } 00:12:33.419 ] 00:12:33.419 }' 00:12:33.419 13:38:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.419 13:38:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:33.986 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:33.986 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:33.986 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:33.986 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:33.986 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:33.986 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:33.986 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:33.986 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:33.986 [2024-07-12 13:38:22.531873] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:33.986 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:33.986 "name": "raid_bdev1", 00:12:33.986 "aliases": [ 00:12:33.986 "7170559f-7a35-4d8c-a3d7-c22035647f94" 00:12:33.986 ], 00:12:33.986 "product_name": "Raid Volume", 00:12:33.986 "block_size": 512, 00:12:33.986 "num_blocks": 63488, 00:12:33.986 "uuid": "7170559f-7a35-4d8c-a3d7-c22035647f94", 00:12:33.986 "assigned_rate_limits": { 00:12:33.986 "rw_ios_per_sec": 0, 00:12:33.986 "rw_mbytes_per_sec": 0, 00:12:33.986 "r_mbytes_per_sec": 0, 00:12:33.986 "w_mbytes_per_sec": 0 00:12:33.986 }, 00:12:33.986 "claimed": false, 00:12:33.986 "zoned": false, 00:12:33.986 "supported_io_types": { 00:12:33.986 "read": true, 00:12:33.986 "write": true, 00:12:33.986 "unmap": false, 00:12:33.986 "flush": false, 00:12:33.986 "reset": true, 00:12:33.986 "nvme_admin": false, 00:12:33.986 "nvme_io": false, 00:12:33.986 "nvme_io_md": false, 00:12:33.986 "write_zeroes": true, 00:12:33.986 "zcopy": false, 00:12:33.986 "get_zone_info": false, 00:12:33.986 "zone_management": false, 00:12:33.986 "zone_append": false, 00:12:33.986 "compare": false, 00:12:33.986 "compare_and_write": false, 00:12:33.986 "abort": false, 00:12:33.987 "seek_hole": false, 00:12:33.987 "seek_data": false, 00:12:33.987 "copy": false, 00:12:33.987 "nvme_iov_md": false 00:12:33.987 }, 00:12:33.987 "memory_domains": [ 00:12:33.987 { 00:12:33.987 "dma_device_id": "system", 00:12:33.987 "dma_device_type": 1 00:12:33.987 }, 
00:12:33.987 { 00:12:33.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.987 "dma_device_type": 2 00:12:33.987 }, 00:12:33.987 { 00:12:33.987 "dma_device_id": "system", 00:12:33.987 "dma_device_type": 1 00:12:33.987 }, 00:12:33.987 { 00:12:33.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.987 "dma_device_type": 2 00:12:33.987 } 00:12:33.987 ], 00:12:33.987 "driver_specific": { 00:12:33.987 "raid": { 00:12:33.987 "uuid": "7170559f-7a35-4d8c-a3d7-c22035647f94", 00:12:33.987 "strip_size_kb": 0, 00:12:33.987 "state": "online", 00:12:33.987 "raid_level": "raid1", 00:12:33.987 "superblock": true, 00:12:33.987 "num_base_bdevs": 2, 00:12:33.987 "num_base_bdevs_discovered": 2, 00:12:33.987 "num_base_bdevs_operational": 2, 00:12:33.987 "base_bdevs_list": [ 00:12:33.987 { 00:12:33.987 "name": "pt1", 00:12:33.987 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:33.987 "is_configured": true, 00:12:33.987 "data_offset": 2048, 00:12:33.987 "data_size": 63488 00:12:33.987 }, 00:12:33.987 { 00:12:33.987 "name": "pt2", 00:12:33.987 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:33.987 "is_configured": true, 00:12:33.987 "data_offset": 2048, 00:12:33.987 "data_size": 63488 00:12:33.987 } 00:12:33.987 ] 00:12:33.987 } 00:12:33.987 } 00:12:33.987 }' 00:12:33.987 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:34.246 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:34.246 pt2' 00:12:34.246 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:34.246 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:34.246 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:34.246 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:34.246 "name": "pt1", 00:12:34.246 "aliases": [ 00:12:34.246 "00000000-0000-0000-0000-000000000001" 00:12:34.246 ], 00:12:34.246 "product_name": "passthru", 00:12:34.246 "block_size": 512, 00:12:34.246 "num_blocks": 65536, 00:12:34.246 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:34.246 "assigned_rate_limits": { 00:12:34.246 "rw_ios_per_sec": 0, 00:12:34.246 "rw_mbytes_per_sec": 0, 00:12:34.246 "r_mbytes_per_sec": 0, 00:12:34.246 "w_mbytes_per_sec": 0 00:12:34.246 }, 00:12:34.246 "claimed": true, 00:12:34.246 "claim_type": "exclusive_write", 00:12:34.246 "zoned": false, 00:12:34.246 "supported_io_types": { 00:12:34.246 "read": true, 00:12:34.246 "write": true, 00:12:34.246 "unmap": true, 00:12:34.246 "flush": true, 00:12:34.246 "reset": true, 00:12:34.246 "nvme_admin": false, 00:12:34.246 "nvme_io": false, 00:12:34.246 "nvme_io_md": false, 00:12:34.246 "write_zeroes": true, 00:12:34.246 "zcopy": true, 00:12:34.246 "get_zone_info": false, 00:12:34.246 "zone_management": false, 00:12:34.246 "zone_append": false, 00:12:34.246 "compare": false, 00:12:34.246 "compare_and_write": false, 00:12:34.246 "abort": true, 00:12:34.246 "seek_hole": false, 00:12:34.246 "seek_data": false, 00:12:34.246 "copy": true, 00:12:34.246 "nvme_iov_md": false 00:12:34.246 }, 00:12:34.246 "memory_domains": [ 00:12:34.246 { 00:12:34.246 "dma_device_id": "system", 00:12:34.246 "dma_device_type": 1 00:12:34.246 }, 00:12:34.246 { 00:12:34.246 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:34.246 "dma_device_type": 2 00:12:34.246 } 00:12:34.246 ], 00:12:34.246 "driver_specific": { 00:12:34.246 "passthru": { 00:12:34.246 "name": "pt1", 00:12:34.246 "base_bdev_name": "malloc1" 00:12:34.246 } 00:12:34.246 } 00:12:34.246 }' 00:12:34.246 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:34.504 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:34.504 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:34.504 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:34.504 13:38:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:34.504 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:34.504 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:34.504 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:34.762 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:34.762 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:34.762 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:34.762 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:34.762 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:34.762 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:34.762 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:35.019 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:35.019 "name": "pt2", 00:12:35.019 "aliases": [ 00:12:35.019 "00000000-0000-0000-0000-000000000002" 00:12:35.019 ], 00:12:35.019 "product_name": "passthru", 00:12:35.019 "block_size": 512, 00:12:35.019 "num_blocks": 65536, 00:12:35.019 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:35.019 "assigned_rate_limits": { 00:12:35.019 "rw_ios_per_sec": 0, 00:12:35.019 "rw_mbytes_per_sec": 0, 00:12:35.019 "r_mbytes_per_sec": 0, 00:12:35.019 "w_mbytes_per_sec": 0 00:12:35.019 }, 00:12:35.019 "claimed": true, 00:12:35.019 "claim_type": "exclusive_write", 00:12:35.019 "zoned": false, 00:12:35.019 "supported_io_types": { 00:12:35.019 "read": true, 00:12:35.019 "write": true, 00:12:35.019 "unmap": true, 00:12:35.019 "flush": true, 00:12:35.019 "reset": true, 00:12:35.019 "nvme_admin": false, 00:12:35.019 "nvme_io": false, 00:12:35.019 "nvme_io_md": false, 00:12:35.019 "write_zeroes": true, 00:12:35.019 "zcopy": true, 00:12:35.019 "get_zone_info": false, 00:12:35.019 "zone_management": false, 00:12:35.019 "zone_append": false, 00:12:35.019 "compare": false, 00:12:35.019 "compare_and_write": false, 00:12:35.019 "abort": true, 00:12:35.019 "seek_hole": false, 00:12:35.019 "seek_data": false, 00:12:35.019 "copy": true, 00:12:35.019 "nvme_iov_md": false 00:12:35.019 }, 00:12:35.019 "memory_domains": [ 00:12:35.019 { 00:12:35.019 "dma_device_id": "system", 00:12:35.019 "dma_device_type": 1 00:12:35.019 }, 00:12:35.019 { 00:12:35.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.019 "dma_device_type": 2 00:12:35.019 } 00:12:35.019 ], 00:12:35.019 "driver_specific": { 
00:12:35.019 "passthru": { 00:12:35.019 "name": "pt2", 00:12:35.019 "base_bdev_name": "malloc2" 00:12:35.019 } 00:12:35.019 } 00:12:35.019 }' 00:12:35.019 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.019 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.019 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:35.019 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.019 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.277 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:35.277 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.277 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.277 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:35.277 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.277 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.277 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:35.277 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:35.277 13:38:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:35.535 [2024-07-12 13:38:24.080002] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:35.535 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 7170559f-7a35-4d8c-a3d7-c22035647f94 '!=' 7170559f-7a35-4d8c-a3d7-c22035647f94 ']' 00:12:35.535 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:12:35.535 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:35.535 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:35.535 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:36.100 [2024-07-12 13:38:24.581114] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:12:36.100 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:36.100 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:36.100 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:36.100 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:36.100 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:36.100 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:36.100 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.100 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.100 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.100 13:38:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.100 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.100 13:38:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:36.664 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.664 "name": "raid_bdev1", 00:12:36.664 "uuid": "7170559f-7a35-4d8c-a3d7-c22035647f94", 00:12:36.664 "strip_size_kb": 0, 00:12:36.664 "state": "online", 00:12:36.664 "raid_level": "raid1", 00:12:36.664 "superblock": true, 00:12:36.664 "num_base_bdevs": 2, 00:12:36.664 "num_base_bdevs_discovered": 1, 00:12:36.665 "num_base_bdevs_operational": 1, 00:12:36.665 "base_bdevs_list": [ 00:12:36.665 { 00:12:36.665 "name": null, 00:12:36.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.665 "is_configured": false, 00:12:36.665 "data_offset": 2048, 00:12:36.665 "data_size": 63488 00:12:36.665 }, 00:12:36.665 { 00:12:36.665 "name": "pt2", 00:12:36.665 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:36.665 "is_configured": true, 00:12:36.665 "data_offset": 2048, 00:12:36.665 "data_size": 63488 00:12:36.665 } 00:12:36.665 ] 00:12:36.665 }' 00:12:36.665 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.665 13:38:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:37.232 13:38:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:37.490 [2024-07-12 13:38:25.984789] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:37.490 [2024-07-12 13:38:25.984817] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:37.490 [2024-07-12 13:38:25.984874] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:37.490 [2024-07-12 13:38:25.984917] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:37.490 [2024-07-12 13:38:25.984939] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc0b970 name raid_bdev1, state offline 00:12:37.490 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.490 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:12:37.749 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:12:37.749 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:12:37.749 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:12:37.749 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:37.749 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:38.006 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:12:38.006 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:38.006 13:38:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:12:38.006 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:12:38.006 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:12:38.006 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:38.264 [2024-07-12 13:38:26.602404] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:38.264 [2024-07-12 13:38:26.602449] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:38.264 [2024-07-12 13:38:26.602466] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcaff00 00:12:38.264 [2024-07-12 13:38:26.602479] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:38.264 [2024-07-12 13:38:26.604081] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:38.264 [2024-07-12 13:38:26.604108] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:38.264 [2024-07-12 13:38:26.604169] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:38.264 [2024-07-12 13:38:26.604194] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:38.264 [2024-07-12 13:38:26.604274] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcad970 00:12:38.264 [2024-07-12 13:38:26.604289] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:38.264 [2024-07-12 13:38:26.604462] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcad240 00:12:38.264 [2024-07-12 13:38:26.604581] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcad970 00:12:38.264 [2024-07-12 13:38:26.604591] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcad970 00:12:38.264 [2024-07-12 13:38:26.604685] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:38.264 pt2 00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:38.264 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:38.522 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.522 "name": "raid_bdev1", 00:12:38.522 "uuid": "7170559f-7a35-4d8c-a3d7-c22035647f94", 00:12:38.522 "strip_size_kb": 0, 00:12:38.522 "state": "online", 00:12:38.522 "raid_level": "raid1", 00:12:38.522 "superblock": true, 00:12:38.522 "num_base_bdevs": 2, 00:12:38.522 "num_base_bdevs_discovered": 1, 00:12:38.522 "num_base_bdevs_operational": 1, 00:12:38.522 "base_bdevs_list": [ 00:12:38.522 { 00:12:38.522 "name": null, 00:12:38.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.522 "is_configured": false, 00:12:38.522 "data_offset": 2048, 00:12:38.522 "data_size": 63488 00:12:38.522 }, 00:12:38.522 { 00:12:38.522 "name": "pt2", 00:12:38.522 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:38.522 "is_configured": true, 00:12:38.522 "data_offset": 2048, 00:12:38.522 "data_size": 63488 00:12:38.522 } 00:12:38.522 ] 00:12:38.522 }' 00:12:38.522 13:38:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.522 13:38:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.458 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:39.458 [2024-07-12 13:38:27.949983] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:39.458 [2024-07-12 13:38:27.950009] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:39.458 [2024-07-12 13:38:27.950067] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:39.458 [2024-07-12 13:38:27.950111] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:39.458 [2024-07-12 13:38:27.950123] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcad970 name raid_bdev1, state offline 00:12:39.458 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.458 13:38:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:12:39.717 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:12:39.717 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:12:39.717 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:12:39.717 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:39.976 [2024-07-12 13:38:28.427220] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:39.976 [2024-07-12 13:38:28.427270] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:39.976 [2024-07-12 13:38:28.427290] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc0c230 00:12:39.976 [2024-07-12 13:38:28.427303] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:39.976 [2024-07-12 13:38:28.428959] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:12:39.976 [2024-07-12 13:38:28.428986] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:39.976 [2024-07-12 13:38:28.429055] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:39.976 [2024-07-12 13:38:28.429080] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:39.976 [2024-07-12 13:38:28.429177] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:12:39.976 [2024-07-12 13:38:28.429190] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:39.976 [2024-07-12 13:38:28.429202] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcae5c0 name raid_bdev1, state configuring 00:12:39.976 [2024-07-12 13:38:28.429225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:39.976 [2024-07-12 13:38:28.429282] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcb0130 00:12:39.976 [2024-07-12 13:38:28.429292] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:39.976 [2024-07-12 13:38:28.429458] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcae650 00:12:39.976 [2024-07-12 13:38:28.429576] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcb0130 00:12:39.976 [2024-07-12 13:38:28.429586] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcb0130 00:12:39.976 [2024-07-12 13:38:28.429679] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:39.976 pt1 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.976 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:40.235 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:40.235 "name": "raid_bdev1", 00:12:40.235 "uuid": "7170559f-7a35-4d8c-a3d7-c22035647f94", 00:12:40.235 "strip_size_kb": 0, 00:12:40.235 "state": "online", 00:12:40.235 "raid_level": "raid1", 
00:12:40.235 "superblock": true, 00:12:40.235 "num_base_bdevs": 2, 00:12:40.235 "num_base_bdevs_discovered": 1, 00:12:40.235 "num_base_bdevs_operational": 1, 00:12:40.235 "base_bdevs_list": [ 00:12:40.235 { 00:12:40.235 "name": null, 00:12:40.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:40.235 "is_configured": false, 00:12:40.235 "data_offset": 2048, 00:12:40.235 "data_size": 63488 00:12:40.235 }, 00:12:40.235 { 00:12:40.235 "name": "pt2", 00:12:40.235 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:40.235 "is_configured": true, 00:12:40.235 "data_offset": 2048, 00:12:40.235 "data_size": 63488 00:12:40.235 } 00:12:40.235 ] 00:12:40.235 }' 00:12:40.235 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:40.235 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.170 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:41.170 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:12:41.430 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:12:41.430 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:41.430 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:12:41.998 [2024-07-12 13:38:30.280363] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 7170559f-7a35-4d8c-a3d7-c22035647f94 '!=' 7170559f-7a35-4d8c-a3d7-c22035647f94 ']' 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 442961 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 442961 ']' 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 442961 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 442961 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 442961' 00:12:41.998 killing process with pid 442961 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 442961 00:12:41.998 [2024-07-12 13:38:30.365760] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:41.998 [2024-07-12 13:38:30.365814] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:41.998 [2024-07-12 13:38:30.365855] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:41.998 [2024-07-12 13:38:30.365866] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcb0130 name 
raid_bdev1, state offline 00:12:41.998 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 442961 00:12:41.998 [2024-07-12 13:38:30.382208] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:42.257 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:42.257 00:12:42.257 real 0m19.656s 00:12:42.257 user 0m35.889s 00:12:42.257 sys 0m3.346s 00:12:42.257 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:42.257 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:42.257 ************************************ 00:12:42.257 END TEST raid_superblock_test 00:12:42.257 ************************************ 00:12:42.257 13:38:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:42.257 13:38:30 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:12:42.257 13:38:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:42.257 13:38:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:42.257 13:38:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:42.257 ************************************ 00:12:42.257 START TEST raid_read_error_test 00:12:42.257 ************************************ 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 
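For reference, the superblock re-assembly check traced above reduces to a short RPC sequence against the test app's /var/tmp/spdk-raid.sock socket: recreate a passthru bdev over its malloc backing device, let the examine path re-assemble raid_bdev1 from the on-disk superblock, confirm the reported state with bdev_raid_get_bdevs filtered through jq, then delete the raid bdev before the next pass. This is a minimal sketch assembled only from commands visible in the trace; the SPDK_DIR and RPC shorthands are introduced here purely to abbreviate the workspace path shown in the log.

# Sketch of the state-verification flow from raid_superblock_test (paths abridged).
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Re-register the passthru bdev; the raid examine callback finds the superblock on it.
$RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

# Dump only raid_bdev1 and check it came back online as raid1 with one discovered base bdev.
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

# Tear down before the next iteration of the test.
$RPC bdev_raid_delete raid_bdev1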
00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.3rif5PvZGK 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=445850 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 445850 /var/tmp/spdk-raid.sock 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 445850 ']' 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:42.257 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:42.257 13:38:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:42.257 [2024-07-12 13:38:30.740754] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:12:42.257 [2024-07-12 13:38:30.740822] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid445850 ] 00:12:42.516 [2024-07-12 13:38:30.859605] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:42.516 [2024-07-12 13:38:30.966238] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:42.516 [2024-07-12 13:38:31.033547] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:42.516 [2024-07-12 13:38:31.033586] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:43.453 13:38:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:43.453 13:38:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:43.453 13:38:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:43.453 13:38:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:43.453 BaseBdev1_malloc 00:12:43.453 13:38:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:43.712 true 00:12:43.712 13:38:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:43.972 [2024-07-12 13:38:32.384118] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
EE_BaseBdev1_malloc 00:12:43.972 [2024-07-12 13:38:32.384163] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:43.972 [2024-07-12 13:38:32.384185] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9aca10 00:12:43.972 [2024-07-12 13:38:32.384197] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:43.972 [2024-07-12 13:38:32.386085] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:43.972 [2024-07-12 13:38:32.386114] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:43.972 BaseBdev1 00:12:43.972 13:38:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:43.972 13:38:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:44.232 BaseBdev2_malloc 00:12:44.232 13:38:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:44.491 true 00:12:44.491 13:38:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:44.750 [2024-07-12 13:38:33.123849] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:44.750 [2024-07-12 13:38:33.123893] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:44.750 [2024-07-12 13:38:33.123914] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9b1250 00:12:44.750 [2024-07-12 13:38:33.123932] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:44.750 [2024-07-12 13:38:33.125507] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:44.750 [2024-07-12 13:38:33.125535] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:44.750 BaseBdev2 00:12:44.750 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:45.008 [2024-07-12 13:38:33.364511] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:45.008 [2024-07-12 13:38:33.365855] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:45.009 [2024-07-12 13:38:33.366055] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9b2c60 00:12:45.009 [2024-07-12 13:38:33.366070] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:45.009 [2024-07-12 13:38:33.366264] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x81a640 00:12:45.009 [2024-07-12 13:38:33.366416] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9b2c60 00:12:45.009 [2024-07-12 13:38:33.366426] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9b2c60 00:12:45.009 [2024-07-12 13:38:33.366537] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.009 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:45.267 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:45.267 "name": "raid_bdev1", 00:12:45.267 "uuid": "2ef580ee-25d7-4a49-8ff0-eb3b0065a0e1", 00:12:45.267 "strip_size_kb": 0, 00:12:45.267 "state": "online", 00:12:45.267 "raid_level": "raid1", 00:12:45.267 "superblock": true, 00:12:45.267 "num_base_bdevs": 2, 00:12:45.267 "num_base_bdevs_discovered": 2, 00:12:45.267 "num_base_bdevs_operational": 2, 00:12:45.267 "base_bdevs_list": [ 00:12:45.267 { 00:12:45.267 "name": "BaseBdev1", 00:12:45.267 "uuid": "742a0b12-20b8-5d0c-b90d-bd38e421aa49", 00:12:45.267 "is_configured": true, 00:12:45.267 "data_offset": 2048, 00:12:45.267 "data_size": 63488 00:12:45.267 }, 00:12:45.267 { 00:12:45.267 "name": "BaseBdev2", 00:12:45.267 "uuid": "12f36384-429e-5bdf-a9f4-47ff59e301be", 00:12:45.267 "is_configured": true, 00:12:45.267 "data_offset": 2048, 00:12:45.267 "data_size": 63488 00:12:45.267 } 00:12:45.267 ] 00:12:45.267 }' 00:12:45.267 13:38:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:45.267 13:38:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:45.834 13:38:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:45.834 13:38:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:45.834 [2024-07-12 13:38:34.307287] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9ae5b0 00:12:46.772 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:47.032 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:47.033 "name": "raid_bdev1", 00:12:47.033 "uuid": "2ef580ee-25d7-4a49-8ff0-eb3b0065a0e1", 00:12:47.033 "strip_size_kb": 0, 00:12:47.033 "state": "online", 00:12:47.033 "raid_level": "raid1", 00:12:47.033 "superblock": true, 00:12:47.033 "num_base_bdevs": 2, 00:12:47.033 "num_base_bdevs_discovered": 2, 00:12:47.033 "num_base_bdevs_operational": 2, 00:12:47.033 "base_bdevs_list": [ 00:12:47.033 { 00:12:47.033 "name": "BaseBdev1", 00:12:47.033 "uuid": "742a0b12-20b8-5d0c-b90d-bd38e421aa49", 00:12:47.033 "is_configured": true, 00:12:47.033 "data_offset": 2048, 00:12:47.033 "data_size": 63488 00:12:47.033 }, 00:12:47.033 { 00:12:47.033 "name": "BaseBdev2", 00:12:47.033 "uuid": "12f36384-429e-5bdf-a9f4-47ff59e301be", 00:12:47.033 "is_configured": true, 00:12:47.033 "data_offset": 2048, 00:12:47.033 "data_size": 63488 00:12:47.033 } 00:12:47.033 ] 00:12:47.033 }' 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:47.033 13:38:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.601 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:47.860 [2024-07-12 13:38:36.397532] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:47.860 [2024-07-12 13:38:36.397570] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:47.860 [2024-07-12 13:38:36.400701] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:47.860 [2024-07-12 13:38:36.400747] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:47.860 [2024-07-12 13:38:36.400825] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:47.860 [2024-07-12 13:38:36.400837] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9b2c60 name raid_bdev1, state offline 00:12:47.860 0 00:12:47.860 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 445850 00:12:47.860 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 445850 ']' 00:12:47.860 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 445850 00:12:47.860 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:47.860 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:47.860 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 445850 00:12:48.120 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:48.120 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:48.120 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 445850' 00:12:48.120 killing process with pid 445850 00:12:48.120 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 445850 00:12:48.120 [2024-07-12 13:38:36.463704] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:48.120 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 445850 00:12:48.120 [2024-07-12 13:38:36.476207] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:48.380 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.3rif5PvZGK 00:12:48.380 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:48.380 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:48.380 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:48.380 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:48.380 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:48.380 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:48.380 13:38:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:48.380 00:12:48.380 real 0m6.055s 00:12:48.380 user 0m9.372s 00:12:48.380 sys 0m1.082s 00:12:48.380 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:48.380 13:38:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:48.380 ************************************ 00:12:48.380 END TEST raid_read_error_test 00:12:48.380 ************************************ 00:12:48.380 13:38:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:48.380 13:38:36 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:12:48.380 13:38:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:48.380 13:38:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:48.380 13:38:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:48.380 ************************************ 00:12:48.380 START TEST raid_write_error_test 00:12:48.380 ************************************ 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 
write 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.9s29WTdz09 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=446722 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 446722 /var/tmp/spdk-raid.sock 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 446722 ']' 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:48.380 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:48.380 13:38:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:48.380 [2024-07-12 13:38:36.879820] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:12:48.380 [2024-07-12 13:38:36.879892] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid446722 ] 00:12:48.640 [2024-07-12 13:38:37.012629] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:48.640 [2024-07-12 13:38:37.114241] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:48.640 [2024-07-12 13:38:37.176874] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:48.640 [2024-07-12 13:38:37.176919] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:49.577 13:38:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:49.577 13:38:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:49.577 13:38:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:49.577 13:38:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:49.577 BaseBdev1_malloc 00:12:49.577 13:38:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:49.837 true 00:12:49.837 13:38:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:50.097 [2024-07-12 13:38:38.539563] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:50.097 [2024-07-12 13:38:38.539609] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:50.097 [2024-07-12 13:38:38.539628] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x220ea10 00:12:50.097 [2024-07-12 13:38:38.539641] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:50.097 [2024-07-12 13:38:38.541331] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:50.097 [2024-07-12 13:38:38.541361] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:50.097 BaseBdev1 00:12:50.097 13:38:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:50.097 13:38:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:50.357 BaseBdev2_malloc 00:12:50.357 13:38:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:50.616 true 00:12:50.616 13:38:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:50.876 [2024-07-12 13:38:39.222057] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:50.876 [2024-07-12 13:38:39.222103] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:50.876 [2024-07-12 13:38:39.222124] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2213250 00:12:50.876 [2024-07-12 13:38:39.222137] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:50.876 [2024-07-12 13:38:39.223671] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:50.876 [2024-07-12 13:38:39.223700] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:50.876 BaseBdev2 00:12:50.876 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:51.136 [2024-07-12 13:38:39.470729] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:51.136 [2024-07-12 13:38:39.472193] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:51.136 [2024-07-12 13:38:39.472381] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2214c60 00:12:51.136 [2024-07-12 13:38:39.472394] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:51.136 [2024-07-12 13:38:39.472587] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x207c640 00:12:51.136 [2024-07-12 13:38:39.472735] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2214c60 00:12:51.136 [2024-07-12 13:38:39.472746] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2214c60 00:12:51.136 [2024-07-12 13:38:39.472853] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.136 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "raid_bdev1")' 00:12:51.396 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.396 "name": "raid_bdev1", 00:12:51.396 "uuid": "1d4476a1-daf6-4a86-a3b6-767e7df1f9a8", 00:12:51.396 "strip_size_kb": 0, 00:12:51.396 "state": "online", 00:12:51.396 "raid_level": "raid1", 00:12:51.396 "superblock": true, 00:12:51.396 "num_base_bdevs": 2, 00:12:51.396 "num_base_bdevs_discovered": 2, 00:12:51.396 "num_base_bdevs_operational": 2, 00:12:51.396 "base_bdevs_list": [ 00:12:51.396 { 00:12:51.396 "name": "BaseBdev1", 00:12:51.396 "uuid": "20e4bd6d-805b-5209-9664-4271d2b6413e", 00:12:51.396 "is_configured": true, 00:12:51.396 "data_offset": 2048, 00:12:51.396 "data_size": 63488 00:12:51.396 }, 00:12:51.396 { 00:12:51.396 "name": "BaseBdev2", 00:12:51.396 "uuid": "d26b9d7d-7fdd-5791-9e61-a8d7a0458604", 00:12:51.396 "is_configured": true, 00:12:51.396 "data_offset": 2048, 00:12:51.396 "data_size": 63488 00:12:51.396 } 00:12:51.396 ] 00:12:51.396 }' 00:12:51.396 13:38:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.396 13:38:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:51.965 13:38:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:51.965 13:38:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:51.965 [2024-07-12 13:38:40.485728] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22105b0 00:12:52.903 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:53.162 [2024-07-12 13:38:41.605896] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:12:53.162 [2024-07-12 13:38:41.605966] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:53.162 [2024-07-12 13:38:41.606143] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x22105b0 00:12:53.162 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:53.162 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:53.162 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:12:53.162 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:12:53.162 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:53.162 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:53.162 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:53.162 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:53.163 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:53.163 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:53.163 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:53.163 13:38:41 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:53.163 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:53.163 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:53.163 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.163 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:53.422 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:53.422 "name": "raid_bdev1", 00:12:53.422 "uuid": "1d4476a1-daf6-4a86-a3b6-767e7df1f9a8", 00:12:53.422 "strip_size_kb": 0, 00:12:53.422 "state": "online", 00:12:53.422 "raid_level": "raid1", 00:12:53.422 "superblock": true, 00:12:53.422 "num_base_bdevs": 2, 00:12:53.422 "num_base_bdevs_discovered": 1, 00:12:53.422 "num_base_bdevs_operational": 1, 00:12:53.422 "base_bdevs_list": [ 00:12:53.422 { 00:12:53.422 "name": null, 00:12:53.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:53.422 "is_configured": false, 00:12:53.422 "data_offset": 2048, 00:12:53.422 "data_size": 63488 00:12:53.422 }, 00:12:53.422 { 00:12:53.422 "name": "BaseBdev2", 00:12:53.422 "uuid": "d26b9d7d-7fdd-5791-9e61-a8d7a0458604", 00:12:53.422 "is_configured": true, 00:12:53.422 "data_offset": 2048, 00:12:53.422 "data_size": 63488 00:12:53.422 } 00:12:53.422 ] 00:12:53.422 }' 00:12:53.422 13:38:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:53.422 13:38:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.990 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:54.250 [2024-07-12 13:38:42.689783] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:54.250 [2024-07-12 13:38:42.689822] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:54.250 [2024-07-12 13:38:42.692976] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:54.250 [2024-07-12 13:38:42.693004] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:54.250 [2024-07-12 13:38:42.693056] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:54.250 [2024-07-12 13:38:42.693068] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2214c60 name raid_bdev1, state offline 00:12:54.250 0 00:12:54.250 13:38:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 446722 00:12:54.250 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 446722 ']' 00:12:54.250 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 446722 00:12:54.250 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:54.250 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:54.250 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 446722 00:12:54.250 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 
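In the write variant just traced, the injected failure is expected to degrade the array rather than surface to the caller: failing writes on EE_BaseBdev1_malloc causes raid1 to drop BaseBdev1 (slot 0), so the follow-up state query shows the bdev still online with only one discovered/operational base bdev and a null entry in slot 0. A condensed view of that check, reusing the RPC/SPDK_DIR shorthands from the sketch above and only commands present in the trace:

# After injecting a write failure, raid_bdev1 should stay online but degraded.
$RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
$SPDK_DIR/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests

# Expect num_base_bdevs_discovered/operational to drop to 1 and slot 0 to read back as null.
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'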
00:12:54.250 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:54.250 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 446722' 00:12:54.250 killing process with pid 446722 00:12:54.250 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 446722 00:12:54.250 [2024-07-12 13:38:42.774509] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:54.250 13:38:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 446722 00:12:54.250 [2024-07-12 13:38:42.785181] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:54.509 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.9s29WTdz09 00:12:54.509 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:54.509 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:54.509 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:54.509 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:54.509 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:54.509 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:54.509 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:54.509 00:12:54.509 real 0m6.221s 00:12:54.509 user 0m9.743s 00:12:54.509 sys 0m1.067s 00:12:54.509 13:38:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:54.509 13:38:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.509 ************************************ 00:12:54.509 END TEST raid_write_error_test 00:12:54.509 ************************************ 00:12:54.509 13:38:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:54.509 13:38:43 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:12:54.509 13:38:43 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:54.509 13:38:43 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:12:54.509 13:38:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:54.509 13:38:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:54.509 13:38:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:54.768 ************************************ 00:12:54.768 START TEST raid_state_function_test 00:12:54.768 ************************************ 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:54.768 
13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=447686 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 447686' 00:12:54.768 Process raid pid: 447686 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 447686 /var/tmp/spdk-raid.sock 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 447686 ']' 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:54.768 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:54.768 13:38:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.768 [2024-07-12 13:38:43.182010] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:12:54.768 [2024-07-12 13:38:43.182085] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:54.768 [2024-07-12 13:38:43.312190] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:55.027 [2024-07-12 13:38:43.418471] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:55.027 [2024-07-12 13:38:43.483337] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:55.027 [2024-07-12 13:38:43.483365] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:55.596 13:38:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:55.596 13:38:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:55.596 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:55.855 [2024-07-12 13:38:44.274001] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:55.855 [2024-07-12 13:38:44.274057] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:55.855 [2024-07-12 13:38:44.274073] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:55.855 [2024-07-12 13:38:44.274086] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:55.855 [2024-07-12 13:38:44.274094] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:55.855 [2024-07-12 13:38:44.274110] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.855 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:56.115 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:56.115 "name": "Existed_Raid", 00:12:56.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.115 "strip_size_kb": 64, 00:12:56.115 "state": "configuring", 00:12:56.115 "raid_level": "raid0", 00:12:56.115 "superblock": false, 00:12:56.115 "num_base_bdevs": 3, 00:12:56.115 "num_base_bdevs_discovered": 0, 00:12:56.115 "num_base_bdevs_operational": 3, 00:12:56.115 "base_bdevs_list": [ 00:12:56.115 { 00:12:56.115 "name": "BaseBdev1", 00:12:56.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.115 "is_configured": false, 00:12:56.115 "data_offset": 0, 00:12:56.115 "data_size": 0 00:12:56.115 }, 00:12:56.115 { 00:12:56.115 "name": "BaseBdev2", 00:12:56.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.115 "is_configured": false, 00:12:56.115 "data_offset": 0, 00:12:56.115 "data_size": 0 00:12:56.115 }, 00:12:56.115 { 00:12:56.115 "name": "BaseBdev3", 00:12:56.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.115 "is_configured": false, 00:12:56.115 "data_offset": 0, 00:12:56.115 "data_size": 0 00:12:56.115 } 00:12:56.115 ] 00:12:56.115 }' 00:12:56.115 13:38:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:56.115 13:38:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.683 13:38:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:56.942 [2024-07-12 13:38:45.364764] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:56.942 [2024-07-12 13:38:45.364800] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16ae350 name Existed_Raid, state configuring 00:12:56.942 13:38:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:57.202 [2024-07-12 13:38:45.597397] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:57.202 [2024-07-12 13:38:45.597431] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:57.202 [2024-07-12 13:38:45.597441] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:57.202 [2024-07-12 13:38:45.597453] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:57.202 [2024-07-12 13:38:45.597461] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:57.202 [2024-07-12 13:38:45.597473] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:57.202 13:38:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:57.462 [2024-07-12 13:38:45.853255] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:57.462 BaseBdev1 00:12:57.462 
13:38:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:57.462 13:38:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:57.462 13:38:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:57.462 13:38:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:57.462 13:38:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:57.462 13:38:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:57.462 13:38:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:57.721 13:38:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:57.981 [ 00:12:57.981 { 00:12:57.981 "name": "BaseBdev1", 00:12:57.981 "aliases": [ 00:12:57.981 "9d996903-11b3-4fd0-a8b2-5778ebd879e8" 00:12:57.981 ], 00:12:57.981 "product_name": "Malloc disk", 00:12:57.981 "block_size": 512, 00:12:57.981 "num_blocks": 65536, 00:12:57.981 "uuid": "9d996903-11b3-4fd0-a8b2-5778ebd879e8", 00:12:57.981 "assigned_rate_limits": { 00:12:57.981 "rw_ios_per_sec": 0, 00:12:57.981 "rw_mbytes_per_sec": 0, 00:12:57.981 "r_mbytes_per_sec": 0, 00:12:57.981 "w_mbytes_per_sec": 0 00:12:57.981 }, 00:12:57.981 "claimed": true, 00:12:57.981 "claim_type": "exclusive_write", 00:12:57.981 "zoned": false, 00:12:57.981 "supported_io_types": { 00:12:57.981 "read": true, 00:12:57.981 "write": true, 00:12:57.981 "unmap": true, 00:12:57.981 "flush": true, 00:12:57.981 "reset": true, 00:12:57.981 "nvme_admin": false, 00:12:57.981 "nvme_io": false, 00:12:57.981 "nvme_io_md": false, 00:12:57.981 "write_zeroes": true, 00:12:57.981 "zcopy": true, 00:12:57.981 "get_zone_info": false, 00:12:57.981 "zone_management": false, 00:12:57.981 "zone_append": false, 00:12:57.981 "compare": false, 00:12:57.981 "compare_and_write": false, 00:12:57.981 "abort": true, 00:12:57.981 "seek_hole": false, 00:12:57.981 "seek_data": false, 00:12:57.981 "copy": true, 00:12:57.981 "nvme_iov_md": false 00:12:57.981 }, 00:12:57.981 "memory_domains": [ 00:12:57.981 { 00:12:57.981 "dma_device_id": "system", 00:12:57.981 "dma_device_type": 1 00:12:57.981 }, 00:12:57.981 { 00:12:57.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.981 "dma_device_type": 2 00:12:57.981 } 00:12:57.981 ], 00:12:57.981 "driver_specific": {} 00:12:57.981 } 00:12:57.981 ] 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.981 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:58.240 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:58.240 "name": "Existed_Raid", 00:12:58.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.240 "strip_size_kb": 64, 00:12:58.240 "state": "configuring", 00:12:58.240 "raid_level": "raid0", 00:12:58.240 "superblock": false, 00:12:58.240 "num_base_bdevs": 3, 00:12:58.240 "num_base_bdevs_discovered": 1, 00:12:58.240 "num_base_bdevs_operational": 3, 00:12:58.240 "base_bdevs_list": [ 00:12:58.240 { 00:12:58.240 "name": "BaseBdev1", 00:12:58.240 "uuid": "9d996903-11b3-4fd0-a8b2-5778ebd879e8", 00:12:58.240 "is_configured": true, 00:12:58.240 "data_offset": 0, 00:12:58.240 "data_size": 65536 00:12:58.240 }, 00:12:58.240 { 00:12:58.240 "name": "BaseBdev2", 00:12:58.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.240 "is_configured": false, 00:12:58.240 "data_offset": 0, 00:12:58.240 "data_size": 0 00:12:58.240 }, 00:12:58.240 { 00:12:58.240 "name": "BaseBdev3", 00:12:58.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.240 "is_configured": false, 00:12:58.240 "data_offset": 0, 00:12:58.240 "data_size": 0 00:12:58.240 } 00:12:58.240 ] 00:12:58.240 }' 00:12:58.240 13:38:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:58.240 13:38:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.808 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:58.808 [2024-07-12 13:38:47.369267] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:58.808 [2024-07-12 13:38:47.369308] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16adc20 name Existed_Raid, state configuring 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:59.068 [2024-07-12 13:38:47.545766] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:59.068 [2024-07-12 13:38:47.547257] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:59.068 [2024-07-12 13:38:47.547291] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:59.068 [2024-07-12 13:38:47.547301] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:59.068 [2024-07-12 13:38:47.547312] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev3 doesn't exist now 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.068 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:59.328 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.328 "name": "Existed_Raid", 00:12:59.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.328 "strip_size_kb": 64, 00:12:59.328 "state": "configuring", 00:12:59.328 "raid_level": "raid0", 00:12:59.328 "superblock": false, 00:12:59.328 "num_base_bdevs": 3, 00:12:59.328 "num_base_bdevs_discovered": 1, 00:12:59.328 "num_base_bdevs_operational": 3, 00:12:59.328 "base_bdevs_list": [ 00:12:59.328 { 00:12:59.328 "name": "BaseBdev1", 00:12:59.328 "uuid": "9d996903-11b3-4fd0-a8b2-5778ebd879e8", 00:12:59.328 "is_configured": true, 00:12:59.328 "data_offset": 0, 00:12:59.328 "data_size": 65536 00:12:59.328 }, 00:12:59.328 { 00:12:59.328 "name": "BaseBdev2", 00:12:59.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.328 "is_configured": false, 00:12:59.328 "data_offset": 0, 00:12:59.328 "data_size": 0 00:12:59.328 }, 00:12:59.328 { 00:12:59.328 "name": "BaseBdev3", 00:12:59.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.328 "is_configured": false, 00:12:59.328 "data_offset": 0, 00:12:59.328 "data_size": 0 00:12:59.328 } 00:12:59.328 ] 00:12:59.328 }' 00:12:59.328 13:38:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.328 13:38:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.896 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:00.155 [2024-07-12 13:38:48.559869] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:00.155 BaseBdev2 00:13:00.155 13:38:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:00.155 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:00.155 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:00.155 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:00.155 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:00.155 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:00.155 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:00.414 [ 00:13:00.414 { 00:13:00.414 "name": "BaseBdev2", 00:13:00.414 "aliases": [ 00:13:00.414 "d170ee9a-f843-47dd-81b8-93b61daf5d26" 00:13:00.414 ], 00:13:00.414 "product_name": "Malloc disk", 00:13:00.414 "block_size": 512, 00:13:00.414 "num_blocks": 65536, 00:13:00.414 "uuid": "d170ee9a-f843-47dd-81b8-93b61daf5d26", 00:13:00.414 "assigned_rate_limits": { 00:13:00.414 "rw_ios_per_sec": 0, 00:13:00.414 "rw_mbytes_per_sec": 0, 00:13:00.414 "r_mbytes_per_sec": 0, 00:13:00.414 "w_mbytes_per_sec": 0 00:13:00.414 }, 00:13:00.414 "claimed": true, 00:13:00.414 "claim_type": "exclusive_write", 00:13:00.414 "zoned": false, 00:13:00.414 "supported_io_types": { 00:13:00.414 "read": true, 00:13:00.414 "write": true, 00:13:00.414 "unmap": true, 00:13:00.414 "flush": true, 00:13:00.414 "reset": true, 00:13:00.414 "nvme_admin": false, 00:13:00.414 "nvme_io": false, 00:13:00.414 "nvme_io_md": false, 00:13:00.414 "write_zeroes": true, 00:13:00.414 "zcopy": true, 00:13:00.414 "get_zone_info": false, 00:13:00.414 "zone_management": false, 00:13:00.414 "zone_append": false, 00:13:00.414 "compare": false, 00:13:00.414 "compare_and_write": false, 00:13:00.414 "abort": true, 00:13:00.414 "seek_hole": false, 00:13:00.414 "seek_data": false, 00:13:00.414 "copy": true, 00:13:00.414 "nvme_iov_md": false 00:13:00.414 }, 00:13:00.414 "memory_domains": [ 00:13:00.414 { 00:13:00.414 "dma_device_id": "system", 00:13:00.414 "dma_device_type": 1 00:13:00.414 }, 00:13:00.414 { 00:13:00.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:00.414 "dma_device_type": 2 00:13:00.414 } 00:13:00.414 ], 00:13:00.414 "driver_specific": {} 00:13:00.414 } 00:13:00.414 ] 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
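At this point BaseBdev2 has been created and claimed by Existed_Raid, and the discovered count is about to move from 1 to 2. The same three-step pattern is used for every leg in this trace: create a 32 MiB malloc bdev with a 512-byte block size, let examine callbacks finish, then confirm the bdev answers a lookup (the -t 2000 timeout mirrors the waitforbdev helper seen above). A condensed sketch, reusing the socket and script path from this run:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  $rpc bdev_malloc_create 32 512 -b BaseBdev2   # 32 MiB / 512 B blocks -> the 65536-block Malloc disk in the JSON above
  $rpc bdev_wait_for_examine                    # let examine/claim callbacks settle
  $rpc bdev_get_bdevs -b BaseBdev2 -t 2000      # errors out if the bdev never appears within the timeout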
00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.414 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.415 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.415 13:38:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:00.674 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:00.674 "name": "Existed_Raid", 00:13:00.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.674 "strip_size_kb": 64, 00:13:00.674 "state": "configuring", 00:13:00.674 "raid_level": "raid0", 00:13:00.674 "superblock": false, 00:13:00.674 "num_base_bdevs": 3, 00:13:00.674 "num_base_bdevs_discovered": 2, 00:13:00.674 "num_base_bdevs_operational": 3, 00:13:00.674 "base_bdevs_list": [ 00:13:00.674 { 00:13:00.674 "name": "BaseBdev1", 00:13:00.674 "uuid": "9d996903-11b3-4fd0-a8b2-5778ebd879e8", 00:13:00.674 "is_configured": true, 00:13:00.674 "data_offset": 0, 00:13:00.674 "data_size": 65536 00:13:00.674 }, 00:13:00.674 { 00:13:00.674 "name": "BaseBdev2", 00:13:00.674 "uuid": "d170ee9a-f843-47dd-81b8-93b61daf5d26", 00:13:00.674 "is_configured": true, 00:13:00.674 "data_offset": 0, 00:13:00.674 "data_size": 65536 00:13:00.674 }, 00:13:00.674 { 00:13:00.674 "name": "BaseBdev3", 00:13:00.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.674 "is_configured": false, 00:13:00.674 "data_offset": 0, 00:13:00.674 "data_size": 0 00:13:00.674 } 00:13:00.674 ] 00:13:00.674 }' 00:13:00.674 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:00.674 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.242 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:01.501 [2024-07-12 13:38:49.950924] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:01.501 [2024-07-12 13:38:49.950970] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16aeb10 00:13:01.501 [2024-07-12 13:38:49.950979] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:01.501 [2024-07-12 13:38:49.951167] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16ae7e0 00:13:01.501 [2024-07-12 13:38:49.951291] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16aeb10 00:13:01.501 [2024-07-12 13:38:49.951301] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16aeb10 00:13:01.501 [2024-07-12 13:38:49.951473] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:01.501 BaseBdev3 00:13:01.501 13:38:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:01.501 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:01.501 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:01.501 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:01.501 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:01.501 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:01.501 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:01.760 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:02.018 [ 00:13:02.018 { 00:13:02.018 "name": "BaseBdev3", 00:13:02.018 "aliases": [ 00:13:02.018 "008288c5-b671-46cf-ba42-ded16d0939f7" 00:13:02.018 ], 00:13:02.018 "product_name": "Malloc disk", 00:13:02.018 "block_size": 512, 00:13:02.018 "num_blocks": 65536, 00:13:02.018 "uuid": "008288c5-b671-46cf-ba42-ded16d0939f7", 00:13:02.018 "assigned_rate_limits": { 00:13:02.018 "rw_ios_per_sec": 0, 00:13:02.018 "rw_mbytes_per_sec": 0, 00:13:02.018 "r_mbytes_per_sec": 0, 00:13:02.018 "w_mbytes_per_sec": 0 00:13:02.018 }, 00:13:02.018 "claimed": true, 00:13:02.018 "claim_type": "exclusive_write", 00:13:02.018 "zoned": false, 00:13:02.018 "supported_io_types": { 00:13:02.018 "read": true, 00:13:02.018 "write": true, 00:13:02.018 "unmap": true, 00:13:02.018 "flush": true, 00:13:02.018 "reset": true, 00:13:02.018 "nvme_admin": false, 00:13:02.018 "nvme_io": false, 00:13:02.018 "nvme_io_md": false, 00:13:02.018 "write_zeroes": true, 00:13:02.018 "zcopy": true, 00:13:02.018 "get_zone_info": false, 00:13:02.018 "zone_management": false, 00:13:02.018 "zone_append": false, 00:13:02.018 "compare": false, 00:13:02.018 "compare_and_write": false, 00:13:02.018 "abort": true, 00:13:02.018 "seek_hole": false, 00:13:02.018 "seek_data": false, 00:13:02.018 "copy": true, 00:13:02.018 "nvme_iov_md": false 00:13:02.018 }, 00:13:02.018 "memory_domains": [ 00:13:02.018 { 00:13:02.018 "dma_device_id": "system", 00:13:02.018 "dma_device_type": 1 00:13:02.018 }, 00:13:02.018 { 00:13:02.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.018 "dma_device_type": 2 00:13:02.019 } 00:13:02.019 ], 00:13:02.019 "driver_specific": {} 00:13:02.019 } 00:13:02.019 ] 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:02.019 
13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.019 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:02.277 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:02.277 "name": "Existed_Raid", 00:13:02.277 "uuid": "d00087b3-7606-440c-9e06-3c35f016bedf", 00:13:02.277 "strip_size_kb": 64, 00:13:02.277 "state": "online", 00:13:02.277 "raid_level": "raid0", 00:13:02.277 "superblock": false, 00:13:02.277 "num_base_bdevs": 3, 00:13:02.277 "num_base_bdevs_discovered": 3, 00:13:02.277 "num_base_bdevs_operational": 3, 00:13:02.277 "base_bdevs_list": [ 00:13:02.277 { 00:13:02.277 "name": "BaseBdev1", 00:13:02.277 "uuid": "9d996903-11b3-4fd0-a8b2-5778ebd879e8", 00:13:02.277 "is_configured": true, 00:13:02.277 "data_offset": 0, 00:13:02.277 "data_size": 65536 00:13:02.277 }, 00:13:02.277 { 00:13:02.277 "name": "BaseBdev2", 00:13:02.277 "uuid": "d170ee9a-f843-47dd-81b8-93b61daf5d26", 00:13:02.277 "is_configured": true, 00:13:02.277 "data_offset": 0, 00:13:02.277 "data_size": 65536 00:13:02.277 }, 00:13:02.277 { 00:13:02.277 "name": "BaseBdev3", 00:13:02.277 "uuid": "008288c5-b671-46cf-ba42-ded16d0939f7", 00:13:02.277 "is_configured": true, 00:13:02.277 "data_offset": 0, 00:13:02.277 "data_size": 65536 00:13:02.277 } 00:13:02.277 ] 00:13:02.277 }' 00:13:02.277 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:02.277 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.843 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:02.843 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:02.843 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:02.843 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:02.843 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:02.843 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:02.843 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:02.843 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:03.102 [2024-07-12 13:38:51.535460] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:03.102 13:38:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:03.102 "name": "Existed_Raid", 00:13:03.102 "aliases": [ 00:13:03.102 "d00087b3-7606-440c-9e06-3c35f016bedf" 00:13:03.102 ], 00:13:03.102 "product_name": "Raid Volume", 00:13:03.102 "block_size": 512, 00:13:03.102 "num_blocks": 196608, 00:13:03.102 "uuid": "d00087b3-7606-440c-9e06-3c35f016bedf", 00:13:03.102 "assigned_rate_limits": { 00:13:03.102 "rw_ios_per_sec": 0, 00:13:03.102 "rw_mbytes_per_sec": 0, 00:13:03.102 "r_mbytes_per_sec": 0, 00:13:03.102 "w_mbytes_per_sec": 0 00:13:03.102 }, 00:13:03.102 "claimed": false, 00:13:03.102 "zoned": false, 00:13:03.102 "supported_io_types": { 00:13:03.102 "read": true, 00:13:03.102 "write": true, 00:13:03.102 "unmap": true, 00:13:03.102 "flush": true, 00:13:03.102 "reset": true, 00:13:03.102 "nvme_admin": false, 00:13:03.102 "nvme_io": false, 00:13:03.102 "nvme_io_md": false, 00:13:03.102 "write_zeroes": true, 00:13:03.102 "zcopy": false, 00:13:03.102 "get_zone_info": false, 00:13:03.102 "zone_management": false, 00:13:03.102 "zone_append": false, 00:13:03.102 "compare": false, 00:13:03.102 "compare_and_write": false, 00:13:03.102 "abort": false, 00:13:03.102 "seek_hole": false, 00:13:03.102 "seek_data": false, 00:13:03.102 "copy": false, 00:13:03.102 "nvme_iov_md": false 00:13:03.102 }, 00:13:03.102 "memory_domains": [ 00:13:03.102 { 00:13:03.102 "dma_device_id": "system", 00:13:03.102 "dma_device_type": 1 00:13:03.102 }, 00:13:03.102 { 00:13:03.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.102 "dma_device_type": 2 00:13:03.102 }, 00:13:03.102 { 00:13:03.102 "dma_device_id": "system", 00:13:03.102 "dma_device_type": 1 00:13:03.102 }, 00:13:03.102 { 00:13:03.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.102 "dma_device_type": 2 00:13:03.102 }, 00:13:03.102 { 00:13:03.102 "dma_device_id": "system", 00:13:03.102 "dma_device_type": 1 00:13:03.102 }, 00:13:03.102 { 00:13:03.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.102 "dma_device_type": 2 00:13:03.102 } 00:13:03.102 ], 00:13:03.102 "driver_specific": { 00:13:03.102 "raid": { 00:13:03.102 "uuid": "d00087b3-7606-440c-9e06-3c35f016bedf", 00:13:03.102 "strip_size_kb": 64, 00:13:03.102 "state": "online", 00:13:03.102 "raid_level": "raid0", 00:13:03.102 "superblock": false, 00:13:03.102 "num_base_bdevs": 3, 00:13:03.102 "num_base_bdevs_discovered": 3, 00:13:03.102 "num_base_bdevs_operational": 3, 00:13:03.102 "base_bdevs_list": [ 00:13:03.102 { 00:13:03.102 "name": "BaseBdev1", 00:13:03.102 "uuid": "9d996903-11b3-4fd0-a8b2-5778ebd879e8", 00:13:03.102 "is_configured": true, 00:13:03.102 "data_offset": 0, 00:13:03.102 "data_size": 65536 00:13:03.102 }, 00:13:03.102 { 00:13:03.102 "name": "BaseBdev2", 00:13:03.102 "uuid": "d170ee9a-f843-47dd-81b8-93b61daf5d26", 00:13:03.102 "is_configured": true, 00:13:03.102 "data_offset": 0, 00:13:03.102 "data_size": 65536 00:13:03.102 }, 00:13:03.102 { 00:13:03.102 "name": "BaseBdev3", 00:13:03.102 "uuid": "008288c5-b671-46cf-ba42-ded16d0939f7", 00:13:03.102 "is_configured": true, 00:13:03.102 "data_offset": 0, 00:13:03.102 "data_size": 65536 00:13:03.102 } 00:13:03.102 ] 00:13:03.102 } 00:13:03.102 } 00:13:03.102 }' 00:13:03.102 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:03.102 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:03.102 BaseBdev2 00:13:03.102 BaseBdev3' 00:13:03.102 13:38:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:03.102 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:03.102 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:03.361 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:03.361 "name": "BaseBdev1", 00:13:03.361 "aliases": [ 00:13:03.361 "9d996903-11b3-4fd0-a8b2-5778ebd879e8" 00:13:03.361 ], 00:13:03.361 "product_name": "Malloc disk", 00:13:03.361 "block_size": 512, 00:13:03.361 "num_blocks": 65536, 00:13:03.361 "uuid": "9d996903-11b3-4fd0-a8b2-5778ebd879e8", 00:13:03.361 "assigned_rate_limits": { 00:13:03.361 "rw_ios_per_sec": 0, 00:13:03.361 "rw_mbytes_per_sec": 0, 00:13:03.361 "r_mbytes_per_sec": 0, 00:13:03.361 "w_mbytes_per_sec": 0 00:13:03.361 }, 00:13:03.361 "claimed": true, 00:13:03.361 "claim_type": "exclusive_write", 00:13:03.361 "zoned": false, 00:13:03.361 "supported_io_types": { 00:13:03.361 "read": true, 00:13:03.361 "write": true, 00:13:03.361 "unmap": true, 00:13:03.361 "flush": true, 00:13:03.361 "reset": true, 00:13:03.361 "nvme_admin": false, 00:13:03.361 "nvme_io": false, 00:13:03.361 "nvme_io_md": false, 00:13:03.361 "write_zeroes": true, 00:13:03.361 "zcopy": true, 00:13:03.361 "get_zone_info": false, 00:13:03.361 "zone_management": false, 00:13:03.361 "zone_append": false, 00:13:03.361 "compare": false, 00:13:03.361 "compare_and_write": false, 00:13:03.361 "abort": true, 00:13:03.361 "seek_hole": false, 00:13:03.361 "seek_data": false, 00:13:03.361 "copy": true, 00:13:03.361 "nvme_iov_md": false 00:13:03.361 }, 00:13:03.361 "memory_domains": [ 00:13:03.361 { 00:13:03.361 "dma_device_id": "system", 00:13:03.361 "dma_device_type": 1 00:13:03.361 }, 00:13:03.361 { 00:13:03.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.361 "dma_device_type": 2 00:13:03.361 } 00:13:03.361 ], 00:13:03.361 "driver_specific": {} 00:13:03.361 }' 00:13:03.362 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.362 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.362 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:03.362 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.620 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.620 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:03.621 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.621 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.621 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:03.621 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.621 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.879 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:03.879 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:03.879 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:03.879 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:04.138 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:04.138 "name": "BaseBdev2", 00:13:04.138 "aliases": [ 00:13:04.138 "d170ee9a-f843-47dd-81b8-93b61daf5d26" 00:13:04.138 ], 00:13:04.138 "product_name": "Malloc disk", 00:13:04.138 "block_size": 512, 00:13:04.138 "num_blocks": 65536, 00:13:04.138 "uuid": "d170ee9a-f843-47dd-81b8-93b61daf5d26", 00:13:04.138 "assigned_rate_limits": { 00:13:04.138 "rw_ios_per_sec": 0, 00:13:04.138 "rw_mbytes_per_sec": 0, 00:13:04.138 "r_mbytes_per_sec": 0, 00:13:04.138 "w_mbytes_per_sec": 0 00:13:04.138 }, 00:13:04.138 "claimed": true, 00:13:04.138 "claim_type": "exclusive_write", 00:13:04.138 "zoned": false, 00:13:04.138 "supported_io_types": { 00:13:04.138 "read": true, 00:13:04.138 "write": true, 00:13:04.138 "unmap": true, 00:13:04.138 "flush": true, 00:13:04.138 "reset": true, 00:13:04.138 "nvme_admin": false, 00:13:04.138 "nvme_io": false, 00:13:04.138 "nvme_io_md": false, 00:13:04.138 "write_zeroes": true, 00:13:04.138 "zcopy": true, 00:13:04.139 "get_zone_info": false, 00:13:04.139 "zone_management": false, 00:13:04.139 "zone_append": false, 00:13:04.139 "compare": false, 00:13:04.139 "compare_and_write": false, 00:13:04.139 "abort": true, 00:13:04.139 "seek_hole": false, 00:13:04.139 "seek_data": false, 00:13:04.139 "copy": true, 00:13:04.139 "nvme_iov_md": false 00:13:04.139 }, 00:13:04.139 "memory_domains": [ 00:13:04.139 { 00:13:04.139 "dma_device_id": "system", 00:13:04.139 "dma_device_type": 1 00:13:04.139 }, 00:13:04.139 { 00:13:04.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.139 "dma_device_type": 2 00:13:04.139 } 00:13:04.139 ], 00:13:04.139 "driver_specific": {} 00:13:04.139 }' 00:13:04.139 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.139 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.139 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:04.139 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.139 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.139 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:04.139 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.139 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.397 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:04.397 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.397 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.397 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.397 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:04.397 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:04.397 13:38:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:04.655 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:04.655 "name": "BaseBdev3", 00:13:04.655 "aliases": [ 00:13:04.655 "008288c5-b671-46cf-ba42-ded16d0939f7" 00:13:04.655 ], 00:13:04.655 "product_name": "Malloc disk", 00:13:04.655 "block_size": 512, 00:13:04.655 "num_blocks": 65536, 00:13:04.655 "uuid": "008288c5-b671-46cf-ba42-ded16d0939f7", 00:13:04.655 "assigned_rate_limits": { 00:13:04.655 "rw_ios_per_sec": 0, 00:13:04.655 "rw_mbytes_per_sec": 0, 00:13:04.655 "r_mbytes_per_sec": 0, 00:13:04.655 "w_mbytes_per_sec": 0 00:13:04.655 }, 00:13:04.655 "claimed": true, 00:13:04.655 "claim_type": "exclusive_write", 00:13:04.655 "zoned": false, 00:13:04.655 "supported_io_types": { 00:13:04.655 "read": true, 00:13:04.655 "write": true, 00:13:04.655 "unmap": true, 00:13:04.655 "flush": true, 00:13:04.655 "reset": true, 00:13:04.655 "nvme_admin": false, 00:13:04.655 "nvme_io": false, 00:13:04.655 "nvme_io_md": false, 00:13:04.655 "write_zeroes": true, 00:13:04.655 "zcopy": true, 00:13:04.655 "get_zone_info": false, 00:13:04.655 "zone_management": false, 00:13:04.655 "zone_append": false, 00:13:04.655 "compare": false, 00:13:04.655 "compare_and_write": false, 00:13:04.655 "abort": true, 00:13:04.655 "seek_hole": false, 00:13:04.655 "seek_data": false, 00:13:04.655 "copy": true, 00:13:04.655 "nvme_iov_md": false 00:13:04.655 }, 00:13:04.655 "memory_domains": [ 00:13:04.655 { 00:13:04.655 "dma_device_id": "system", 00:13:04.655 "dma_device_type": 1 00:13:04.655 }, 00:13:04.655 { 00:13:04.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.655 "dma_device_type": 2 00:13:04.655 } 00:13:04.655 ], 00:13:04.655 "driver_specific": {} 00:13:04.655 }' 00:13:04.655 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.655 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.655 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:04.655 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.913 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.913 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:04.913 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.913 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.913 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:04.913 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.913 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.913 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.913 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:05.173 [2024-07-12 13:38:53.676872] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:05.173 [2024-07-12 13:38:53.676900] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:05.173 [2024-07-12 13:38:53.676950] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
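Deleting BaseBdev1 out from under the online array is the interesting case here: raid0 carries no redundancy (the has_redundancy check that follows returns 1), so the expected state flips from online to offline rather than degraded. Every verify_raid_bdev_state step in this trace reduces to the same query; a sketch built only from the RPC and jq filters visible in the log, with the expected values for this particular step, and $rpc as defined in the earlier sketch:

  info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')

  state=$(echo "$info" | jq -r .state)                             # offline at this point
  discovered=$(echo "$info" | jq -r .num_base_bdevs_discovered)    # 2: BaseBdev2 and BaseBdev3 remain
  operational=$(echo "$info" | jq -r .num_base_bdevs_operational)  # 2, matching the "offline raid0 64 2" expectation below

  [ "$state" = offline ] && [ "$discovered" -eq 2 ] && [ "$operational" -eq 2 ]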
00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.173 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:05.432 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.432 "name": "Existed_Raid", 00:13:05.432 "uuid": "d00087b3-7606-440c-9e06-3c35f016bedf", 00:13:05.432 "strip_size_kb": 64, 00:13:05.432 "state": "offline", 00:13:05.432 "raid_level": "raid0", 00:13:05.432 "superblock": false, 00:13:05.432 "num_base_bdevs": 3, 00:13:05.432 "num_base_bdevs_discovered": 2, 00:13:05.432 "num_base_bdevs_operational": 2, 00:13:05.432 "base_bdevs_list": [ 00:13:05.432 { 00:13:05.432 "name": null, 00:13:05.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:05.432 "is_configured": false, 00:13:05.432 "data_offset": 0, 00:13:05.432 "data_size": 65536 00:13:05.432 }, 00:13:05.432 { 00:13:05.432 "name": "BaseBdev2", 00:13:05.432 "uuid": "d170ee9a-f843-47dd-81b8-93b61daf5d26", 00:13:05.432 "is_configured": true, 00:13:05.432 "data_offset": 0, 00:13:05.432 "data_size": 65536 00:13:05.432 }, 00:13:05.432 { 00:13:05.432 "name": "BaseBdev3", 00:13:05.432 "uuid": "008288c5-b671-46cf-ba42-ded16d0939f7", 00:13:05.432 "is_configured": true, 00:13:05.432 "data_offset": 0, 00:13:05.432 "data_size": 65536 00:13:05.432 } 00:13:05.432 ] 00:13:05.432 }' 00:13:05.432 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.432 13:38:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.013 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:06.013 13:38:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:06.013 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.013 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:06.272 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:06.272 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:06.272 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:06.530 [2024-07-12 13:38:55.017545] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:06.530 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:06.530 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:06.530 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:06.530 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.788 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:06.788 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:06.788 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:07.046 [2024-07-12 13:38:55.519281] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:07.046 [2024-07-12 13:38:55.519329] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16aeb10 name Existed_Raid, state offline 00:13:07.046 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:07.046 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:07.046 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.046 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:07.304 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:07.304 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:07.304 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:07.304 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:07.304 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:07.304 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:07.562 BaseBdev2 00:13:07.562 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 
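With the array and its members torn down (the bdev_raid_get_bdevs query above now returns nothing), the trace moves on to assembly in the reverse order: the malloc legs are created first and the raid is declared while BaseBdev1 is still missing, so it sits in "configuring" with 2 of 3 members discovered, and a member can be detached again in that state. A sketch of that phase using only commands that appear later in this log, again with $rpc as in the earlier sketch:

  $rpc bdev_malloc_create 32 512 -b BaseBdev2
  $rpc bdev_malloc_create 32 512 -b BaseBdev3

  # creating the array with BaseBdev1 still missing succeeds, but it stays "configuring"
  $rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'   # configuring

  # a discovered-but-unassembled member can still be detached while configuring
  $rpc bdev_raid_remove_base_bdev BaseBdev2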
00:13:07.562 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:07.562 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:07.562 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:07.562 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:07.562 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:07.562 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:07.821 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:08.080 [ 00:13:08.080 { 00:13:08.080 "name": "BaseBdev2", 00:13:08.080 "aliases": [ 00:13:08.080 "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e" 00:13:08.080 ], 00:13:08.080 "product_name": "Malloc disk", 00:13:08.080 "block_size": 512, 00:13:08.080 "num_blocks": 65536, 00:13:08.080 "uuid": "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e", 00:13:08.080 "assigned_rate_limits": { 00:13:08.080 "rw_ios_per_sec": 0, 00:13:08.080 "rw_mbytes_per_sec": 0, 00:13:08.080 "r_mbytes_per_sec": 0, 00:13:08.080 "w_mbytes_per_sec": 0 00:13:08.080 }, 00:13:08.080 "claimed": false, 00:13:08.080 "zoned": false, 00:13:08.080 "supported_io_types": { 00:13:08.080 "read": true, 00:13:08.080 "write": true, 00:13:08.080 "unmap": true, 00:13:08.080 "flush": true, 00:13:08.080 "reset": true, 00:13:08.080 "nvme_admin": false, 00:13:08.080 "nvme_io": false, 00:13:08.080 "nvme_io_md": false, 00:13:08.080 "write_zeroes": true, 00:13:08.080 "zcopy": true, 00:13:08.080 "get_zone_info": false, 00:13:08.080 "zone_management": false, 00:13:08.080 "zone_append": false, 00:13:08.080 "compare": false, 00:13:08.080 "compare_and_write": false, 00:13:08.080 "abort": true, 00:13:08.080 "seek_hole": false, 00:13:08.080 "seek_data": false, 00:13:08.080 "copy": true, 00:13:08.080 "nvme_iov_md": false 00:13:08.080 }, 00:13:08.080 "memory_domains": [ 00:13:08.080 { 00:13:08.080 "dma_device_id": "system", 00:13:08.080 "dma_device_type": 1 00:13:08.080 }, 00:13:08.080 { 00:13:08.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:08.080 "dma_device_type": 2 00:13:08.080 } 00:13:08.080 ], 00:13:08.080 "driver_specific": {} 00:13:08.080 } 00:13:08.080 ] 00:13:08.080 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:08.080 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:08.080 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:08.080 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:08.339 BaseBdev3 00:13:08.339 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:08.339 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:08.339 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:08.339 13:38:56 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:08.339 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:08.339 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:08.339 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:08.597 13:38:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:08.856 [ 00:13:08.856 { 00:13:08.856 "name": "BaseBdev3", 00:13:08.856 "aliases": [ 00:13:08.856 "346d1e03-7026-4619-b803-a8f5135562db" 00:13:08.856 ], 00:13:08.856 "product_name": "Malloc disk", 00:13:08.856 "block_size": 512, 00:13:08.856 "num_blocks": 65536, 00:13:08.856 "uuid": "346d1e03-7026-4619-b803-a8f5135562db", 00:13:08.856 "assigned_rate_limits": { 00:13:08.856 "rw_ios_per_sec": 0, 00:13:08.856 "rw_mbytes_per_sec": 0, 00:13:08.856 "r_mbytes_per_sec": 0, 00:13:08.856 "w_mbytes_per_sec": 0 00:13:08.856 }, 00:13:08.856 "claimed": false, 00:13:08.856 "zoned": false, 00:13:08.856 "supported_io_types": { 00:13:08.856 "read": true, 00:13:08.856 "write": true, 00:13:08.856 "unmap": true, 00:13:08.856 "flush": true, 00:13:08.856 "reset": true, 00:13:08.856 "nvme_admin": false, 00:13:08.856 "nvme_io": false, 00:13:08.856 "nvme_io_md": false, 00:13:08.856 "write_zeroes": true, 00:13:08.856 "zcopy": true, 00:13:08.856 "get_zone_info": false, 00:13:08.856 "zone_management": false, 00:13:08.856 "zone_append": false, 00:13:08.856 "compare": false, 00:13:08.856 "compare_and_write": false, 00:13:08.856 "abort": true, 00:13:08.856 "seek_hole": false, 00:13:08.856 "seek_data": false, 00:13:08.856 "copy": true, 00:13:08.856 "nvme_iov_md": false 00:13:08.856 }, 00:13:08.856 "memory_domains": [ 00:13:08.856 { 00:13:08.856 "dma_device_id": "system", 00:13:08.856 "dma_device_type": 1 00:13:08.856 }, 00:13:08.856 { 00:13:08.856 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:08.856 "dma_device_type": 2 00:13:08.856 } 00:13:08.856 ], 00:13:08.856 "driver_specific": {} 00:13:08.856 } 00:13:08.856 ] 00:13:08.856 13:38:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:08.856 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:08.856 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:08.856 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:09.114 [2024-07-12 13:38:57.518992] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:09.114 [2024-07-12 13:38:57.519036] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:09.114 [2024-07-12 13:38:57.519055] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:09.114 [2024-07-12 13:38:57.520394] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:09.114 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 3 00:13:09.114 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:09.114 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:09.114 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:09.114 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:09.114 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:09.114 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:09.114 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:09.114 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:09.115 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:09.115 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.115 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:09.374 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.374 "name": "Existed_Raid", 00:13:09.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.374 "strip_size_kb": 64, 00:13:09.374 "state": "configuring", 00:13:09.374 "raid_level": "raid0", 00:13:09.374 "superblock": false, 00:13:09.374 "num_base_bdevs": 3, 00:13:09.374 "num_base_bdevs_discovered": 2, 00:13:09.374 "num_base_bdevs_operational": 3, 00:13:09.374 "base_bdevs_list": [ 00:13:09.374 { 00:13:09.374 "name": "BaseBdev1", 00:13:09.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.374 "is_configured": false, 00:13:09.374 "data_offset": 0, 00:13:09.374 "data_size": 0 00:13:09.374 }, 00:13:09.374 { 00:13:09.374 "name": "BaseBdev2", 00:13:09.374 "uuid": "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e", 00:13:09.374 "is_configured": true, 00:13:09.374 "data_offset": 0, 00:13:09.374 "data_size": 65536 00:13:09.374 }, 00:13:09.374 { 00:13:09.374 "name": "BaseBdev3", 00:13:09.374 "uuid": "346d1e03-7026-4619-b803-a8f5135562db", 00:13:09.374 "is_configured": true, 00:13:09.374 "data_offset": 0, 00:13:09.374 "data_size": 65536 00:13:09.374 } 00:13:09.374 ] 00:13:09.374 }' 00:13:09.374 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.374 13:38:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:10.311 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:10.311 [2024-07-12 13:38:58.870549] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.571 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.571 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.571 "name": "Existed_Raid", 00:13:10.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.571 "strip_size_kb": 64, 00:13:10.571 "state": "configuring", 00:13:10.571 "raid_level": "raid0", 00:13:10.571 "superblock": false, 00:13:10.571 "num_base_bdevs": 3, 00:13:10.571 "num_base_bdevs_discovered": 1, 00:13:10.571 "num_base_bdevs_operational": 3, 00:13:10.571 "base_bdevs_list": [ 00:13:10.571 { 00:13:10.571 "name": "BaseBdev1", 00:13:10.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.571 "is_configured": false, 00:13:10.571 "data_offset": 0, 00:13:10.571 "data_size": 0 00:13:10.571 }, 00:13:10.571 { 00:13:10.571 "name": null, 00:13:10.571 "uuid": "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e", 00:13:10.571 "is_configured": false, 00:13:10.571 "data_offset": 0, 00:13:10.571 "data_size": 65536 00:13:10.571 }, 00:13:10.571 { 00:13:10.571 "name": "BaseBdev3", 00:13:10.571 "uuid": "346d1e03-7026-4619-b803-a8f5135562db", 00:13:10.571 "is_configured": true, 00:13:10.571 "data_offset": 0, 00:13:10.571 "data_size": 65536 00:13:10.571 } 00:13:10.571 ] 00:13:10.571 }' 00:13:10.571 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.571 13:38:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.139 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.139 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:11.398 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:11.398 13:38:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:11.657 [2024-07-12 13:39:00.174334] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:11.657 BaseBdev1 00:13:11.657 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:11.657 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:11.657 13:39:00 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:11.657 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:11.657 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:11.657 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:11.657 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:12.225 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:12.484 [ 00:13:12.484 { 00:13:12.484 "name": "BaseBdev1", 00:13:12.484 "aliases": [ 00:13:12.484 "431c2aad-1ad8-4650-adef-0e0088ab9940" 00:13:12.484 ], 00:13:12.484 "product_name": "Malloc disk", 00:13:12.484 "block_size": 512, 00:13:12.484 "num_blocks": 65536, 00:13:12.484 "uuid": "431c2aad-1ad8-4650-adef-0e0088ab9940", 00:13:12.484 "assigned_rate_limits": { 00:13:12.484 "rw_ios_per_sec": 0, 00:13:12.484 "rw_mbytes_per_sec": 0, 00:13:12.484 "r_mbytes_per_sec": 0, 00:13:12.484 "w_mbytes_per_sec": 0 00:13:12.484 }, 00:13:12.484 "claimed": true, 00:13:12.484 "claim_type": "exclusive_write", 00:13:12.484 "zoned": false, 00:13:12.484 "supported_io_types": { 00:13:12.484 "read": true, 00:13:12.484 "write": true, 00:13:12.484 "unmap": true, 00:13:12.484 "flush": true, 00:13:12.484 "reset": true, 00:13:12.484 "nvme_admin": false, 00:13:12.484 "nvme_io": false, 00:13:12.484 "nvme_io_md": false, 00:13:12.484 "write_zeroes": true, 00:13:12.485 "zcopy": true, 00:13:12.485 "get_zone_info": false, 00:13:12.485 "zone_management": false, 00:13:12.485 "zone_append": false, 00:13:12.485 "compare": false, 00:13:12.485 "compare_and_write": false, 00:13:12.485 "abort": true, 00:13:12.485 "seek_hole": false, 00:13:12.485 "seek_data": false, 00:13:12.485 "copy": true, 00:13:12.485 "nvme_iov_md": false 00:13:12.485 }, 00:13:12.485 "memory_domains": [ 00:13:12.485 { 00:13:12.485 "dma_device_id": "system", 00:13:12.485 "dma_device_type": 1 00:13:12.485 }, 00:13:12.485 { 00:13:12.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.485 "dma_device_type": 2 00:13:12.485 } 00:13:12.485 ], 00:13:12.485 "driver_specific": {} 00:13:12.485 } 00:13:12.485 ] 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:12.485 13:39:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.485 13:39:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.744 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.744 "name": "Existed_Raid", 00:13:12.744 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.744 "strip_size_kb": 64, 00:13:12.744 "state": "configuring", 00:13:12.744 "raid_level": "raid0", 00:13:12.744 "superblock": false, 00:13:12.744 "num_base_bdevs": 3, 00:13:12.744 "num_base_bdevs_discovered": 2, 00:13:12.744 "num_base_bdevs_operational": 3, 00:13:12.744 "base_bdevs_list": [ 00:13:12.744 { 00:13:12.744 "name": "BaseBdev1", 00:13:12.744 "uuid": "431c2aad-1ad8-4650-adef-0e0088ab9940", 00:13:12.744 "is_configured": true, 00:13:12.744 "data_offset": 0, 00:13:12.744 "data_size": 65536 00:13:12.744 }, 00:13:12.744 { 00:13:12.744 "name": null, 00:13:12.744 "uuid": "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e", 00:13:12.744 "is_configured": false, 00:13:12.744 "data_offset": 0, 00:13:12.744 "data_size": 65536 00:13:12.744 }, 00:13:12.744 { 00:13:12.744 "name": "BaseBdev3", 00:13:12.744 "uuid": "346d1e03-7026-4619-b803-a8f5135562db", 00:13:12.744 "is_configured": true, 00:13:12.744 "data_offset": 0, 00:13:12.744 "data_size": 65536 00:13:12.744 } 00:13:12.744 ] 00:13:12.744 }' 00:13:12.744 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.744 13:39:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:13.311 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:13.311 13:39:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.568 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:13.568 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:14.134 [2024-07-12 13:39:02.592767] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:14.134 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:14.134 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:14.134 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:14.134 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:14.134 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:14.134 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:14.135 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:14.135 13:39:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:14.135 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:14.135 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:14.135 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.135 13:39:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:14.702 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:14.702 "name": "Existed_Raid", 00:13:14.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:14.702 "strip_size_kb": 64, 00:13:14.702 "state": "configuring", 00:13:14.702 "raid_level": "raid0", 00:13:14.702 "superblock": false, 00:13:14.702 "num_base_bdevs": 3, 00:13:14.702 "num_base_bdevs_discovered": 1, 00:13:14.702 "num_base_bdevs_operational": 3, 00:13:14.702 "base_bdevs_list": [ 00:13:14.702 { 00:13:14.702 "name": "BaseBdev1", 00:13:14.702 "uuid": "431c2aad-1ad8-4650-adef-0e0088ab9940", 00:13:14.702 "is_configured": true, 00:13:14.702 "data_offset": 0, 00:13:14.702 "data_size": 65536 00:13:14.702 }, 00:13:14.702 { 00:13:14.702 "name": null, 00:13:14.702 "uuid": "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e", 00:13:14.702 "is_configured": false, 00:13:14.702 "data_offset": 0, 00:13:14.702 "data_size": 65536 00:13:14.702 }, 00:13:14.702 { 00:13:14.703 "name": null, 00:13:14.703 "uuid": "346d1e03-7026-4619-b803-a8f5135562db", 00:13:14.703 "is_configured": false, 00:13:14.703 "data_offset": 0, 00:13:14.703 "data_size": 65536 00:13:14.703 } 00:13:14.703 ] 00:13:14.703 }' 00:13:14.703 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:14.703 13:39:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:15.269 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.269 13:39:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:15.555 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:15.555 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:16.123 [2024-07-12 13:39:04.513883] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
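Note: the trace above detaches BaseBdev3 with bdev_raid_remove_base_bdev, confirms its slot now reads is_configured false, and re-attaches it with bdev_raid_add_base_bdev while Existed_Raid is still assembling. A minimal by-hand sketch of that step, assuming an SPDK bdev_svc target is listening on /var/tmp/spdk-raid.sock as in this run (the rpc/sock shorthand variables are mine; the RPC invocations mirror the trace):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Detach BaseBdev3 without deleting the underlying malloc bdev.
    $rpc -s $sock bdev_raid_remove_base_bdev BaseBdev3

    # The slot stays reserved; its is_configured flag should now read false
    # (index 2 because Existed_Raid is the only raid and BaseBdev3 is its third slot).
    $rpc -s $sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'

    # Re-attach the same bdev; the raid claims it again but stays "configuring"
    # while other slots are still empty.
    $rpc -s $sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3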
00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.123 13:39:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:16.697 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.697 "name": "Existed_Raid", 00:13:16.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.697 "strip_size_kb": 64, 00:13:16.697 "state": "configuring", 00:13:16.697 "raid_level": "raid0", 00:13:16.697 "superblock": false, 00:13:16.697 "num_base_bdevs": 3, 00:13:16.697 "num_base_bdevs_discovered": 2, 00:13:16.697 "num_base_bdevs_operational": 3, 00:13:16.697 "base_bdevs_list": [ 00:13:16.697 { 00:13:16.697 "name": "BaseBdev1", 00:13:16.697 "uuid": "431c2aad-1ad8-4650-adef-0e0088ab9940", 00:13:16.697 "is_configured": true, 00:13:16.697 "data_offset": 0, 00:13:16.697 "data_size": 65536 00:13:16.697 }, 00:13:16.697 { 00:13:16.697 "name": null, 00:13:16.697 "uuid": "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e", 00:13:16.697 "is_configured": false, 00:13:16.697 "data_offset": 0, 00:13:16.697 "data_size": 65536 00:13:16.697 }, 00:13:16.697 { 00:13:16.697 "name": "BaseBdev3", 00:13:16.697 "uuid": "346d1e03-7026-4619-b803-a8f5135562db", 00:13:16.697 "is_configured": true, 00:13:16.697 "data_offset": 0, 00:13:16.697 "data_size": 65536 00:13:16.697 } 00:13:16.697 ] 00:13:16.697 }' 00:13:16.697 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.697 13:39:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.262 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.263 13:39:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:17.829 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:17.829 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:18.396 [2024-07-12 13:39:06.731773] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:18.396 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:18.396 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:18.396 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:18.396 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:18.396 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:18.396 13:39:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:18.396 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:18.396 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:18.396 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:18.396 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:18.396 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.396 13:39:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:18.655 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:18.655 "name": "Existed_Raid", 00:13:18.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:18.655 "strip_size_kb": 64, 00:13:18.655 "state": "configuring", 00:13:18.655 "raid_level": "raid0", 00:13:18.655 "superblock": false, 00:13:18.655 "num_base_bdevs": 3, 00:13:18.655 "num_base_bdevs_discovered": 1, 00:13:18.655 "num_base_bdevs_operational": 3, 00:13:18.655 "base_bdevs_list": [ 00:13:18.655 { 00:13:18.655 "name": null, 00:13:18.655 "uuid": "431c2aad-1ad8-4650-adef-0e0088ab9940", 00:13:18.655 "is_configured": false, 00:13:18.655 "data_offset": 0, 00:13:18.655 "data_size": 65536 00:13:18.655 }, 00:13:18.655 { 00:13:18.655 "name": null, 00:13:18.655 "uuid": "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e", 00:13:18.655 "is_configured": false, 00:13:18.655 "data_offset": 0, 00:13:18.655 "data_size": 65536 00:13:18.655 }, 00:13:18.655 { 00:13:18.655 "name": "BaseBdev3", 00:13:18.655 "uuid": "346d1e03-7026-4619-b803-a8f5135562db", 00:13:18.655 "is_configured": true, 00:13:18.655 "data_offset": 0, 00:13:18.655 "data_size": 65536 00:13:18.655 } 00:13:18.655 ] 00:13:18.655 }' 00:13:18.655 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:18.655 13:39:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.592 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.592 13:39:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:19.592 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:19.592 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:20.158 [2024-07-12 13:39:08.631354] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:20.158 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:20.158 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:20.158 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:20.159 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
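Note: here the test goes one step further and deletes the malloc bdev underneath slot 0 outright (bdev_malloc_delete BaseBdev1) instead of merely detaching it; the raid keeps the slot but reports "name": null with is_configured false, and the previously detached BaseBdev2 is re-attached with bdev_raid_add_base_bdev. A rough by-hand equivalent, under the same running-target assumption as above (the jq projection is my condensed check, not the script's helper):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Destroy the base bdev itself; the raid is notified and drops its claim,
    # but keeps the slot (and its UUID) so the member can be recreated later.
    $rpc -s $sock bdev_malloc_delete BaseBdev1

    # Slot 0 of the (single) raid should now report an unnamed, unconfigured entry.
    $rpc -s $sock bdev_raid_get_bdevs all \
        | jq '.[0].base_bdevs_list[0] | {name, is_configured, uuid}'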
00:13:20.159 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:20.159 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:20.159 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:20.159 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.159 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.159 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.159 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.159 13:39:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:20.726 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:20.726 "name": "Existed_Raid", 00:13:20.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:20.726 "strip_size_kb": 64, 00:13:20.726 "state": "configuring", 00:13:20.726 "raid_level": "raid0", 00:13:20.726 "superblock": false, 00:13:20.726 "num_base_bdevs": 3, 00:13:20.726 "num_base_bdevs_discovered": 2, 00:13:20.726 "num_base_bdevs_operational": 3, 00:13:20.726 "base_bdevs_list": [ 00:13:20.726 { 00:13:20.726 "name": null, 00:13:20.726 "uuid": "431c2aad-1ad8-4650-adef-0e0088ab9940", 00:13:20.726 "is_configured": false, 00:13:20.726 "data_offset": 0, 00:13:20.726 "data_size": 65536 00:13:20.726 }, 00:13:20.726 { 00:13:20.726 "name": "BaseBdev2", 00:13:20.726 "uuid": "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e", 00:13:20.726 "is_configured": true, 00:13:20.726 "data_offset": 0, 00:13:20.726 "data_size": 65536 00:13:20.726 }, 00:13:20.726 { 00:13:20.726 "name": "BaseBdev3", 00:13:20.726 "uuid": "346d1e03-7026-4619-b803-a8f5135562db", 00:13:20.726 "is_configured": true, 00:13:20.726 "data_offset": 0, 00:13:20.726 "data_size": 65536 00:13:20.726 } 00:13:20.726 ] 00:13:20.726 }' 00:13:20.726 13:39:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:20.726 13:39:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:21.663 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:21.663 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.922 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:21.922 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.922 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:22.181 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 431c2aad-1ad8-4650-adef-0e0088ab9940 00:13:22.440 [2024-07-12 13:39:10.876701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 
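Note: to bring the array online, the test recreates the destroyed member under a new name but with the UUID it recorded for the empty first slot (431c2aad-...), which is why the "bdev NewBaseBdev is claimed" message appears above. A condensed sketch of that recovery step, assuming the same target and workspace paths as in this run (the uuid shell variable is my shorthand; the RPC calls mirror the trace):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Read back the UUID the raid still remembers for the empty first slot.
    uuid=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[0].base_bdevs_list[0].uuid')

    # Recreate a 32 MB malloc bdev with 512-byte blocks carrying that UUID; the raid
    # matches it to the slot, claims it, and with all members present goes online.
    $rpc -s $sock bdev_malloc_create 32 512 -b NewBaseBdev -u "$uuid"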
00:13:22.440 [2024-07-12 13:39:10.876736] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16acec0 00:13:22.440 [2024-07-12 13:39:10.876745] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:22.440 [2024-07-12 13:39:10.876944] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15c8cd0 00:13:22.440 [2024-07-12 13:39:10.877066] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16acec0 00:13:22.440 [2024-07-12 13:39:10.877077] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16acec0 00:13:22.440 [2024-07-12 13:39:10.877238] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:22.440 NewBaseBdev 00:13:22.440 13:39:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:22.440 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:22.440 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:22.440 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:22.440 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:22.440 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:22.440 13:39:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:22.699 13:39:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:22.958 [ 00:13:22.958 { 00:13:22.958 "name": "NewBaseBdev", 00:13:22.958 "aliases": [ 00:13:22.958 "431c2aad-1ad8-4650-adef-0e0088ab9940" 00:13:22.958 ], 00:13:22.958 "product_name": "Malloc disk", 00:13:22.958 "block_size": 512, 00:13:22.958 "num_blocks": 65536, 00:13:22.958 "uuid": "431c2aad-1ad8-4650-adef-0e0088ab9940", 00:13:22.958 "assigned_rate_limits": { 00:13:22.958 "rw_ios_per_sec": 0, 00:13:22.958 "rw_mbytes_per_sec": 0, 00:13:22.958 "r_mbytes_per_sec": 0, 00:13:22.958 "w_mbytes_per_sec": 0 00:13:22.958 }, 00:13:22.958 "claimed": true, 00:13:22.958 "claim_type": "exclusive_write", 00:13:22.958 "zoned": false, 00:13:22.958 "supported_io_types": { 00:13:22.958 "read": true, 00:13:22.958 "write": true, 00:13:22.958 "unmap": true, 00:13:22.958 "flush": true, 00:13:22.958 "reset": true, 00:13:22.958 "nvme_admin": false, 00:13:22.958 "nvme_io": false, 00:13:22.958 "nvme_io_md": false, 00:13:22.958 "write_zeroes": true, 00:13:22.958 "zcopy": true, 00:13:22.958 "get_zone_info": false, 00:13:22.958 "zone_management": false, 00:13:22.958 "zone_append": false, 00:13:22.958 "compare": false, 00:13:22.958 "compare_and_write": false, 00:13:22.958 "abort": true, 00:13:22.958 "seek_hole": false, 00:13:22.958 "seek_data": false, 00:13:22.958 "copy": true, 00:13:22.958 "nvme_iov_md": false 00:13:22.958 }, 00:13:22.958 "memory_domains": [ 00:13:22.958 { 00:13:22.958 "dma_device_id": "system", 00:13:22.958 "dma_device_type": 1 00:13:22.958 }, 00:13:22.958 { 00:13:22.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.958 "dma_device_type": 2 00:13:22.958 } 00:13:22.958 ], 00:13:22.958 "driver_specific": {} 00:13:22.958 } 00:13:22.958 ] 
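Note: with every member claimed, the script re-runs its verify_raid_bdev_state helper, this time expecting "online"; the check itself is just bdev_raid_get_bdevs plus jq field comparisons. A stripped-down version of that assertion, assuming the same socket and a single raid named Existed_Raid (the one-line [ ... ] || echo checks are my simplification of the helper):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    info=$($rpc -s $sock bdev_raid_get_bdevs all | jq '.[] | select(.name == "Existed_Raid")')

    # The same fields verify_raid_bdev_state compares, checked one by one.
    [ "$(echo "$info" | jq -r .state)" = online ]                || echo "unexpected state"
    [ "$(echo "$info" | jq -r .raid_level)" = raid0 ]            || echo "unexpected raid level"
    [ "$(echo "$info" | jq -r .strip_size_kb)" = 64 ]            || echo "unexpected strip size"
    [ "$(echo "$info" | jq -r .num_base_bdevs_discovered)" = 3 ] || echo "base bdev missing"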
00:13:22.958 13:39:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:22.958 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:22.958 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:22.958 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:22.958 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:22.958 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:22.958 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:22.958 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.959 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.959 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.959 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.959 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.959 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:23.218 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:23.218 "name": "Existed_Raid", 00:13:23.218 "uuid": "37ea88cc-de35-4faf-923c-b9357ad48379", 00:13:23.218 "strip_size_kb": 64, 00:13:23.218 "state": "online", 00:13:23.218 "raid_level": "raid0", 00:13:23.218 "superblock": false, 00:13:23.218 "num_base_bdevs": 3, 00:13:23.218 "num_base_bdevs_discovered": 3, 00:13:23.218 "num_base_bdevs_operational": 3, 00:13:23.218 "base_bdevs_list": [ 00:13:23.218 { 00:13:23.218 "name": "NewBaseBdev", 00:13:23.218 "uuid": "431c2aad-1ad8-4650-adef-0e0088ab9940", 00:13:23.218 "is_configured": true, 00:13:23.218 "data_offset": 0, 00:13:23.218 "data_size": 65536 00:13:23.218 }, 00:13:23.218 { 00:13:23.218 "name": "BaseBdev2", 00:13:23.218 "uuid": "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e", 00:13:23.218 "is_configured": true, 00:13:23.218 "data_offset": 0, 00:13:23.218 "data_size": 65536 00:13:23.218 }, 00:13:23.218 { 00:13:23.218 "name": "BaseBdev3", 00:13:23.218 "uuid": "346d1e03-7026-4619-b803-a8f5135562db", 00:13:23.218 "is_configured": true, 00:13:23.218 "data_offset": 0, 00:13:23.218 "data_size": 65536 00:13:23.218 } 00:13:23.218 ] 00:13:23.218 }' 00:13:23.218 13:39:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:23.218 13:39:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.155 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:24.155 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:24.155 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:24.155 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:24.155 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:13:24.155 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:24.155 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:24.155 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:24.414 [2024-07-12 13:39:12.826180] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:24.414 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:24.414 "name": "Existed_Raid", 00:13:24.414 "aliases": [ 00:13:24.414 "37ea88cc-de35-4faf-923c-b9357ad48379" 00:13:24.414 ], 00:13:24.414 "product_name": "Raid Volume", 00:13:24.414 "block_size": 512, 00:13:24.414 "num_blocks": 196608, 00:13:24.414 "uuid": "37ea88cc-de35-4faf-923c-b9357ad48379", 00:13:24.414 "assigned_rate_limits": { 00:13:24.414 "rw_ios_per_sec": 0, 00:13:24.414 "rw_mbytes_per_sec": 0, 00:13:24.414 "r_mbytes_per_sec": 0, 00:13:24.414 "w_mbytes_per_sec": 0 00:13:24.414 }, 00:13:24.414 "claimed": false, 00:13:24.414 "zoned": false, 00:13:24.414 "supported_io_types": { 00:13:24.414 "read": true, 00:13:24.414 "write": true, 00:13:24.414 "unmap": true, 00:13:24.414 "flush": true, 00:13:24.414 "reset": true, 00:13:24.414 "nvme_admin": false, 00:13:24.414 "nvme_io": false, 00:13:24.414 "nvme_io_md": false, 00:13:24.414 "write_zeroes": true, 00:13:24.414 "zcopy": false, 00:13:24.414 "get_zone_info": false, 00:13:24.414 "zone_management": false, 00:13:24.414 "zone_append": false, 00:13:24.414 "compare": false, 00:13:24.414 "compare_and_write": false, 00:13:24.414 "abort": false, 00:13:24.414 "seek_hole": false, 00:13:24.414 "seek_data": false, 00:13:24.414 "copy": false, 00:13:24.414 "nvme_iov_md": false 00:13:24.414 }, 00:13:24.414 "memory_domains": [ 00:13:24.414 { 00:13:24.414 "dma_device_id": "system", 00:13:24.414 "dma_device_type": 1 00:13:24.414 }, 00:13:24.414 { 00:13:24.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.414 "dma_device_type": 2 00:13:24.414 }, 00:13:24.414 { 00:13:24.414 "dma_device_id": "system", 00:13:24.414 "dma_device_type": 1 00:13:24.414 }, 00:13:24.414 { 00:13:24.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.414 "dma_device_type": 2 00:13:24.414 }, 00:13:24.414 { 00:13:24.414 "dma_device_id": "system", 00:13:24.414 "dma_device_type": 1 00:13:24.414 }, 00:13:24.414 { 00:13:24.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.414 "dma_device_type": 2 00:13:24.414 } 00:13:24.414 ], 00:13:24.414 "driver_specific": { 00:13:24.414 "raid": { 00:13:24.414 "uuid": "37ea88cc-de35-4faf-923c-b9357ad48379", 00:13:24.414 "strip_size_kb": 64, 00:13:24.414 "state": "online", 00:13:24.414 "raid_level": "raid0", 00:13:24.414 "superblock": false, 00:13:24.414 "num_base_bdevs": 3, 00:13:24.414 "num_base_bdevs_discovered": 3, 00:13:24.414 "num_base_bdevs_operational": 3, 00:13:24.414 "base_bdevs_list": [ 00:13:24.414 { 00:13:24.414 "name": "NewBaseBdev", 00:13:24.414 "uuid": "431c2aad-1ad8-4650-adef-0e0088ab9940", 00:13:24.414 "is_configured": true, 00:13:24.414 "data_offset": 0, 00:13:24.414 "data_size": 65536 00:13:24.414 }, 00:13:24.414 { 00:13:24.414 "name": "BaseBdev2", 00:13:24.414 "uuid": "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e", 00:13:24.414 "is_configured": true, 00:13:24.414 "data_offset": 0, 00:13:24.414 "data_size": 65536 00:13:24.414 }, 00:13:24.414 { 00:13:24.414 "name": "BaseBdev3", 00:13:24.414 
"uuid": "346d1e03-7026-4619-b803-a8f5135562db", 00:13:24.414 "is_configured": true, 00:13:24.414 "data_offset": 0, 00:13:24.414 "data_size": 65536 00:13:24.414 } 00:13:24.414 ] 00:13:24.414 } 00:13:24.414 } 00:13:24.414 }' 00:13:24.414 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:24.414 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:24.414 BaseBdev2 00:13:24.414 BaseBdev3' 00:13:24.414 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:24.414 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:24.415 13:39:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:24.981 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:24.981 "name": "NewBaseBdev", 00:13:24.981 "aliases": [ 00:13:24.981 "431c2aad-1ad8-4650-adef-0e0088ab9940" 00:13:24.981 ], 00:13:24.981 "product_name": "Malloc disk", 00:13:24.981 "block_size": 512, 00:13:24.981 "num_blocks": 65536, 00:13:24.981 "uuid": "431c2aad-1ad8-4650-adef-0e0088ab9940", 00:13:24.981 "assigned_rate_limits": { 00:13:24.981 "rw_ios_per_sec": 0, 00:13:24.981 "rw_mbytes_per_sec": 0, 00:13:24.981 "r_mbytes_per_sec": 0, 00:13:24.981 "w_mbytes_per_sec": 0 00:13:24.981 }, 00:13:24.981 "claimed": true, 00:13:24.981 "claim_type": "exclusive_write", 00:13:24.981 "zoned": false, 00:13:24.981 "supported_io_types": { 00:13:24.981 "read": true, 00:13:24.981 "write": true, 00:13:24.981 "unmap": true, 00:13:24.981 "flush": true, 00:13:24.981 "reset": true, 00:13:24.981 "nvme_admin": false, 00:13:24.981 "nvme_io": false, 00:13:24.981 "nvme_io_md": false, 00:13:24.981 "write_zeroes": true, 00:13:24.981 "zcopy": true, 00:13:24.981 "get_zone_info": false, 00:13:24.981 "zone_management": false, 00:13:24.981 "zone_append": false, 00:13:24.981 "compare": false, 00:13:24.981 "compare_and_write": false, 00:13:24.981 "abort": true, 00:13:24.981 "seek_hole": false, 00:13:24.981 "seek_data": false, 00:13:24.981 "copy": true, 00:13:24.981 "nvme_iov_md": false 00:13:24.981 }, 00:13:24.981 "memory_domains": [ 00:13:24.981 { 00:13:24.981 "dma_device_id": "system", 00:13:24.981 "dma_device_type": 1 00:13:24.981 }, 00:13:24.981 { 00:13:24.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.981 "dma_device_type": 2 00:13:24.981 } 00:13:24.981 ], 00:13:24.981 "driver_specific": {} 00:13:24.981 }' 00:13:24.981 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.981 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.981 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:24.981 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:25.239 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:25.239 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:25.239 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:25.239 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:25.239 13:39:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:25.239 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.239 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.239 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:25.239 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:25.239 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:25.240 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:25.498 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:25.498 "name": "BaseBdev2", 00:13:25.498 "aliases": [ 00:13:25.498 "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e" 00:13:25.498 ], 00:13:25.498 "product_name": "Malloc disk", 00:13:25.498 "block_size": 512, 00:13:25.498 "num_blocks": 65536, 00:13:25.498 "uuid": "ea4c39d2-c5b0-45b7-8c16-e2efba9b253e", 00:13:25.498 "assigned_rate_limits": { 00:13:25.498 "rw_ios_per_sec": 0, 00:13:25.498 "rw_mbytes_per_sec": 0, 00:13:25.498 "r_mbytes_per_sec": 0, 00:13:25.498 "w_mbytes_per_sec": 0 00:13:25.498 }, 00:13:25.498 "claimed": true, 00:13:25.498 "claim_type": "exclusive_write", 00:13:25.498 "zoned": false, 00:13:25.498 "supported_io_types": { 00:13:25.498 "read": true, 00:13:25.498 "write": true, 00:13:25.498 "unmap": true, 00:13:25.498 "flush": true, 00:13:25.498 "reset": true, 00:13:25.498 "nvme_admin": false, 00:13:25.498 "nvme_io": false, 00:13:25.498 "nvme_io_md": false, 00:13:25.498 "write_zeroes": true, 00:13:25.498 "zcopy": true, 00:13:25.498 "get_zone_info": false, 00:13:25.498 "zone_management": false, 00:13:25.498 "zone_append": false, 00:13:25.498 "compare": false, 00:13:25.498 "compare_and_write": false, 00:13:25.498 "abort": true, 00:13:25.498 "seek_hole": false, 00:13:25.498 "seek_data": false, 00:13:25.498 "copy": true, 00:13:25.498 "nvme_iov_md": false 00:13:25.498 }, 00:13:25.498 "memory_domains": [ 00:13:25.498 { 00:13:25.498 "dma_device_id": "system", 00:13:25.498 "dma_device_type": 1 00:13:25.498 }, 00:13:25.498 { 00:13:25.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.498 "dma_device_type": 2 00:13:25.498 } 00:13:25.498 ], 00:13:25.498 "driver_specific": {} 00:13:25.498 }' 00:13:25.498 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:25.498 13:39:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:25.498 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:25.498 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:25.498 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:25.757 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:25.757 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:25.757 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:25.757 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:25.757 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:13:25.757 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.757 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:25.757 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:25.757 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:25.757 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:26.017 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:26.017 "name": "BaseBdev3", 00:13:26.017 "aliases": [ 00:13:26.017 "346d1e03-7026-4619-b803-a8f5135562db" 00:13:26.017 ], 00:13:26.017 "product_name": "Malloc disk", 00:13:26.017 "block_size": 512, 00:13:26.017 "num_blocks": 65536, 00:13:26.017 "uuid": "346d1e03-7026-4619-b803-a8f5135562db", 00:13:26.017 "assigned_rate_limits": { 00:13:26.017 "rw_ios_per_sec": 0, 00:13:26.017 "rw_mbytes_per_sec": 0, 00:13:26.017 "r_mbytes_per_sec": 0, 00:13:26.017 "w_mbytes_per_sec": 0 00:13:26.017 }, 00:13:26.017 "claimed": true, 00:13:26.017 "claim_type": "exclusive_write", 00:13:26.017 "zoned": false, 00:13:26.017 "supported_io_types": { 00:13:26.017 "read": true, 00:13:26.017 "write": true, 00:13:26.017 "unmap": true, 00:13:26.017 "flush": true, 00:13:26.017 "reset": true, 00:13:26.017 "nvme_admin": false, 00:13:26.017 "nvme_io": false, 00:13:26.017 "nvme_io_md": false, 00:13:26.017 "write_zeroes": true, 00:13:26.017 "zcopy": true, 00:13:26.017 "get_zone_info": false, 00:13:26.017 "zone_management": false, 00:13:26.017 "zone_append": false, 00:13:26.017 "compare": false, 00:13:26.017 "compare_and_write": false, 00:13:26.017 "abort": true, 00:13:26.017 "seek_hole": false, 00:13:26.017 "seek_data": false, 00:13:26.017 "copy": true, 00:13:26.017 "nvme_iov_md": false 00:13:26.017 }, 00:13:26.017 "memory_domains": [ 00:13:26.017 { 00:13:26.017 "dma_device_id": "system", 00:13:26.017 "dma_device_type": 1 00:13:26.017 }, 00:13:26.017 { 00:13:26.017 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.017 "dma_device_type": 2 00:13:26.017 } 00:13:26.017 ], 00:13:26.017 "driver_specific": {} 00:13:26.017 }' 00:13:26.017 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:26.017 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:26.017 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:26.017 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:26.017 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:26.017 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:26.017 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:26.276 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:26.276 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:26.276 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:26.276 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:26.276 13:39:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:26.276 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:26.534 [2024-07-12 13:39:14.943502] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:26.534 [2024-07-12 13:39:14.943529] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:26.534 [2024-07-12 13:39:14.943582] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:26.534 [2024-07-12 13:39:14.943634] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:26.534 [2024-07-12 13:39:14.943646] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16acec0 name Existed_Raid, state offline 00:13:26.534 13:39:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 447686 00:13:26.534 13:39:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 447686 ']' 00:13:26.534 13:39:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 447686 00:13:26.534 13:39:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:26.534 13:39:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:26.534 13:39:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 447686 00:13:26.534 13:39:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:26.534 13:39:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:26.534 13:39:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 447686' 00:13:26.534 killing process with pid 447686 00:13:26.534 13:39:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 447686 00:13:26.534 [2024-07-12 13:39:15.017527] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:26.534 13:39:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 447686 00:13:26.534 [2024-07-12 13:39:15.044736] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:26.794 00:13:26.794 real 0m32.155s 00:13:26.794 user 0m59.152s 00:13:26.794 sys 0m5.537s 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:26.794 ************************************ 00:13:26.794 END TEST raid_state_function_test 00:13:26.794 ************************************ 00:13:26.794 13:39:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:26.794 13:39:15 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:13:26.794 13:39:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:26.794 13:39:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:26.794 13:39:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:26.794 ************************************ 00:13:26.794 START TEST 
raid_state_function_test_sb 00:13:26.794 ************************************ 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=453001 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 453001' 00:13:26.794 Process raid pid: 453001 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 453001 /var/tmp/spdk-raid.sock 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 453001 ']' 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:26.794 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:26.794 13:39:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:27.054 [2024-07-12 13:39:15.425259] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:13:27.054 [2024-07-12 13:39:15.425326] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:27.054 [2024-07-12 13:39:15.557239] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:27.313 [2024-07-12 13:39:15.665739] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:27.313 [2024-07-12 13:39:15.729117] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:27.313 [2024-07-12 13:39:15.729151] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:27.880 13:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:27.880 13:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:27.880 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:28.139 [2024-07-12 13:39:16.580822] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:28.139 [2024-07-12 13:39:16.580862] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:28.139 [2024-07-12 13:39:16.580873] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:28.139 [2024-07-12 13:39:16.580885] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:28.139 [2024-07-12 13:39:16.580894] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:28.140 [2024-07-12 13:39:16.580905] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.140 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:28.398 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.398 "name": "Existed_Raid", 00:13:28.398 "uuid": "c9b5030b-423b-4469-8a98-f8e49d2d8fb8", 00:13:28.398 "strip_size_kb": 64, 00:13:28.398 "state": "configuring", 00:13:28.398 "raid_level": "raid0", 00:13:28.398 "superblock": true, 00:13:28.398 "num_base_bdevs": 3, 00:13:28.398 "num_base_bdevs_discovered": 0, 00:13:28.398 "num_base_bdevs_operational": 3, 00:13:28.398 "base_bdevs_list": [ 00:13:28.398 { 00:13:28.398 "name": "BaseBdev1", 00:13:28.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.398 "is_configured": false, 00:13:28.398 "data_offset": 0, 00:13:28.398 "data_size": 0 00:13:28.398 }, 00:13:28.398 { 00:13:28.398 "name": "BaseBdev2", 00:13:28.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.398 "is_configured": false, 00:13:28.398 "data_offset": 0, 00:13:28.398 "data_size": 0 00:13:28.398 }, 00:13:28.398 { 00:13:28.398 "name": "BaseBdev3", 00:13:28.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.398 "is_configured": false, 00:13:28.398 "data_offset": 0, 00:13:28.398 "data_size": 0 00:13:28.398 } 00:13:28.398 ] 00:13:28.398 }' 00:13:28.398 13:39:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.398 13:39:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:28.966 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:29.224 [2024-07-12 13:39:17.651499] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:29.224 [2024-07-12 13:39:17.651531] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e21350 name Existed_Raid, state configuring 00:13:29.224 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:29.483 [2024-07-12 13:39:17.836018] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:29.483 [2024-07-12 13:39:17.836049] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev1 doesn't exist now 00:13:29.483 [2024-07-12 13:39:17.836058] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:29.483 [2024-07-12 13:39:17.836070] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:29.483 [2024-07-12 13:39:17.836079] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:29.483 [2024-07-12 13:39:17.836090] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:29.483 13:39:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:29.483 [2024-07-12 13:39:18.030404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:29.483 BaseBdev1 00:13:29.483 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:29.483 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:29.483 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:29.483 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:29.483 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:29.483 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:29.483 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:29.742 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:30.000 [ 00:13:30.000 { 00:13:30.000 "name": "BaseBdev1", 00:13:30.000 "aliases": [ 00:13:30.000 "5affc8c7-251c-46e7-9b26-2a35cb4d8fb1" 00:13:30.000 ], 00:13:30.000 "product_name": "Malloc disk", 00:13:30.000 "block_size": 512, 00:13:30.000 "num_blocks": 65536, 00:13:30.000 "uuid": "5affc8c7-251c-46e7-9b26-2a35cb4d8fb1", 00:13:30.000 "assigned_rate_limits": { 00:13:30.000 "rw_ios_per_sec": 0, 00:13:30.000 "rw_mbytes_per_sec": 0, 00:13:30.000 "r_mbytes_per_sec": 0, 00:13:30.000 "w_mbytes_per_sec": 0 00:13:30.000 }, 00:13:30.000 "claimed": true, 00:13:30.000 "claim_type": "exclusive_write", 00:13:30.000 "zoned": false, 00:13:30.000 "supported_io_types": { 00:13:30.000 "read": true, 00:13:30.000 "write": true, 00:13:30.000 "unmap": true, 00:13:30.000 "flush": true, 00:13:30.000 "reset": true, 00:13:30.000 "nvme_admin": false, 00:13:30.000 "nvme_io": false, 00:13:30.000 "nvme_io_md": false, 00:13:30.000 "write_zeroes": true, 00:13:30.000 "zcopy": true, 00:13:30.000 "get_zone_info": false, 00:13:30.000 "zone_management": false, 00:13:30.000 "zone_append": false, 00:13:30.000 "compare": false, 00:13:30.000 "compare_and_write": false, 00:13:30.000 "abort": true, 00:13:30.000 "seek_hole": false, 00:13:30.000 "seek_data": false, 00:13:30.000 "copy": true, 00:13:30.000 "nvme_iov_md": false 00:13:30.000 }, 00:13:30.000 "memory_domains": [ 00:13:30.000 { 00:13:30.000 "dma_device_id": "system", 00:13:30.000 "dma_device_type": 1 00:13:30.000 }, 00:13:30.000 { 00:13:30.000 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:30.000 "dma_device_type": 2 00:13:30.000 } 00:13:30.000 ], 00:13:30.000 "driver_specific": {} 00:13:30.000 } 00:13:30.000 ] 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.000 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:30.258 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:30.258 "name": "Existed_Raid", 00:13:30.258 "uuid": "3a20005c-5fb3-4ccb-901a-d41e2f14d563", 00:13:30.258 "strip_size_kb": 64, 00:13:30.258 "state": "configuring", 00:13:30.258 "raid_level": "raid0", 00:13:30.258 "superblock": true, 00:13:30.258 "num_base_bdevs": 3, 00:13:30.258 "num_base_bdevs_discovered": 1, 00:13:30.258 "num_base_bdevs_operational": 3, 00:13:30.258 "base_bdevs_list": [ 00:13:30.258 { 00:13:30.259 "name": "BaseBdev1", 00:13:30.259 "uuid": "5affc8c7-251c-46e7-9b26-2a35cb4d8fb1", 00:13:30.259 "is_configured": true, 00:13:30.259 "data_offset": 2048, 00:13:30.259 "data_size": 63488 00:13:30.259 }, 00:13:30.259 { 00:13:30.259 "name": "BaseBdev2", 00:13:30.259 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:30.259 "is_configured": false, 00:13:30.259 "data_offset": 0, 00:13:30.259 "data_size": 0 00:13:30.259 }, 00:13:30.259 { 00:13:30.259 "name": "BaseBdev3", 00:13:30.259 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:30.259 "is_configured": false, 00:13:30.259 "data_offset": 0, 00:13:30.259 "data_size": 0 00:13:30.259 } 00:13:30.259 ] 00:13:30.259 }' 00:13:30.259 13:39:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:30.259 13:39:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:30.826 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:30.826 [2024-07-12 13:39:19.381991] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:30.826 
[2024-07-12 13:39:19.382026] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e20c20 name Existed_Raid, state configuring 00:13:30.826 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:31.086 [2024-07-12 13:39:19.630696] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:31.086 [2024-07-12 13:39:19.632173] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:31.086 [2024-07-12 13:39:19.632205] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:31.086 [2024-07-12 13:39:19.632215] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:31.086 [2024-07-12 13:39:19.632227] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.086 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:31.346 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.346 "name": "Existed_Raid", 00:13:31.346 "uuid": "bbf57dc8-1179-407d-8e93-78af86e1db06", 00:13:31.346 "strip_size_kb": 64, 00:13:31.346 "state": "configuring", 00:13:31.346 "raid_level": "raid0", 00:13:31.346 "superblock": true, 00:13:31.346 "num_base_bdevs": 3, 00:13:31.346 "num_base_bdevs_discovered": 1, 00:13:31.346 "num_base_bdevs_operational": 3, 00:13:31.346 "base_bdevs_list": [ 00:13:31.346 { 00:13:31.346 "name": "BaseBdev1", 00:13:31.346 "uuid": "5affc8c7-251c-46e7-9b26-2a35cb4d8fb1", 00:13:31.346 "is_configured": true, 00:13:31.346 "data_offset": 2048, 00:13:31.346 "data_size": 63488 00:13:31.346 }, 00:13:31.346 { 
00:13:31.346 "name": "BaseBdev2", 00:13:31.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:31.346 "is_configured": false, 00:13:31.346 "data_offset": 0, 00:13:31.346 "data_size": 0 00:13:31.346 }, 00:13:31.346 { 00:13:31.346 "name": "BaseBdev3", 00:13:31.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:31.346 "is_configured": false, 00:13:31.346 "data_offset": 0, 00:13:31.346 "data_size": 0 00:13:31.346 } 00:13:31.346 ] 00:13:31.346 }' 00:13:31.346 13:39:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.346 13:39:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:32.284 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:32.284 [2024-07-12 13:39:20.676882] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:32.284 BaseBdev2 00:13:32.284 13:39:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:32.284 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:32.284 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:32.284 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:32.284 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:32.284 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:32.284 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:32.543 13:39:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:32.802 [ 00:13:32.802 { 00:13:32.802 "name": "BaseBdev2", 00:13:32.802 "aliases": [ 00:13:32.802 "a056f8b8-225d-4540-8fd4-b50cdeb2b2b4" 00:13:32.802 ], 00:13:32.802 "product_name": "Malloc disk", 00:13:32.802 "block_size": 512, 00:13:32.802 "num_blocks": 65536, 00:13:32.802 "uuid": "a056f8b8-225d-4540-8fd4-b50cdeb2b2b4", 00:13:32.802 "assigned_rate_limits": { 00:13:32.802 "rw_ios_per_sec": 0, 00:13:32.802 "rw_mbytes_per_sec": 0, 00:13:32.802 "r_mbytes_per_sec": 0, 00:13:32.802 "w_mbytes_per_sec": 0 00:13:32.802 }, 00:13:32.802 "claimed": true, 00:13:32.802 "claim_type": "exclusive_write", 00:13:32.802 "zoned": false, 00:13:32.802 "supported_io_types": { 00:13:32.802 "read": true, 00:13:32.803 "write": true, 00:13:32.803 "unmap": true, 00:13:32.803 "flush": true, 00:13:32.803 "reset": true, 00:13:32.803 "nvme_admin": false, 00:13:32.803 "nvme_io": false, 00:13:32.803 "nvme_io_md": false, 00:13:32.803 "write_zeroes": true, 00:13:32.803 "zcopy": true, 00:13:32.803 "get_zone_info": false, 00:13:32.803 "zone_management": false, 00:13:32.803 "zone_append": false, 00:13:32.803 "compare": false, 00:13:32.803 "compare_and_write": false, 00:13:32.803 "abort": true, 00:13:32.803 "seek_hole": false, 00:13:32.803 "seek_data": false, 00:13:32.803 "copy": true, 00:13:32.803 "nvme_iov_md": false 00:13:32.803 }, 00:13:32.803 "memory_domains": [ 00:13:32.803 { 00:13:32.803 
"dma_device_id": "system", 00:13:32.803 "dma_device_type": 1 00:13:32.803 }, 00:13:32.803 { 00:13:32.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.803 "dma_device_type": 2 00:13:32.803 } 00:13:32.803 ], 00:13:32.803 "driver_specific": {} 00:13:32.803 } 00:13:32.803 ] 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.803 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:33.062 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:33.062 "name": "Existed_Raid", 00:13:33.062 "uuid": "bbf57dc8-1179-407d-8e93-78af86e1db06", 00:13:33.062 "strip_size_kb": 64, 00:13:33.062 "state": "configuring", 00:13:33.062 "raid_level": "raid0", 00:13:33.062 "superblock": true, 00:13:33.062 "num_base_bdevs": 3, 00:13:33.062 "num_base_bdevs_discovered": 2, 00:13:33.062 "num_base_bdevs_operational": 3, 00:13:33.062 "base_bdevs_list": [ 00:13:33.062 { 00:13:33.062 "name": "BaseBdev1", 00:13:33.062 "uuid": "5affc8c7-251c-46e7-9b26-2a35cb4d8fb1", 00:13:33.062 "is_configured": true, 00:13:33.062 "data_offset": 2048, 00:13:33.062 "data_size": 63488 00:13:33.062 }, 00:13:33.062 { 00:13:33.062 "name": "BaseBdev2", 00:13:33.062 "uuid": "a056f8b8-225d-4540-8fd4-b50cdeb2b2b4", 00:13:33.062 "is_configured": true, 00:13:33.062 "data_offset": 2048, 00:13:33.062 "data_size": 63488 00:13:33.062 }, 00:13:33.062 { 00:13:33.062 "name": "BaseBdev3", 00:13:33.062 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:33.062 "is_configured": false, 00:13:33.062 "data_offset": 0, 00:13:33.062 "data_size": 0 00:13:33.062 } 00:13:33.062 ] 00:13:33.062 }' 00:13:33.062 13:39:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:33.062 13:39:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 
00:13:33.631 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:34.200 [2024-07-12 13:39:22.518349] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:34.200 [2024-07-12 13:39:22.518508] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e21b10 00:13:34.200 [2024-07-12 13:39:22.518522] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:34.200 [2024-07-12 13:39:22.518691] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e217e0 00:13:34.200 [2024-07-12 13:39:22.518810] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e21b10 00:13:34.200 [2024-07-12 13:39:22.518820] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e21b10 00:13:34.200 [2024-07-12 13:39:22.518909] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:34.200 BaseBdev3 00:13:34.200 13:39:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:34.200 13:39:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:34.200 13:39:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:34.200 13:39:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:34.200 13:39:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:34.200 13:39:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:34.200 13:39:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:34.459 13:39:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:34.459 [ 00:13:34.459 { 00:13:34.459 "name": "BaseBdev3", 00:13:34.459 "aliases": [ 00:13:34.459 "a1faffd0-525b-4b6e-a68d-17ce1bde4c63" 00:13:34.459 ], 00:13:34.459 "product_name": "Malloc disk", 00:13:34.459 "block_size": 512, 00:13:34.459 "num_blocks": 65536, 00:13:34.459 "uuid": "a1faffd0-525b-4b6e-a68d-17ce1bde4c63", 00:13:34.459 "assigned_rate_limits": { 00:13:34.459 "rw_ios_per_sec": 0, 00:13:34.459 "rw_mbytes_per_sec": 0, 00:13:34.459 "r_mbytes_per_sec": 0, 00:13:34.459 "w_mbytes_per_sec": 0 00:13:34.459 }, 00:13:34.459 "claimed": true, 00:13:34.459 "claim_type": "exclusive_write", 00:13:34.459 "zoned": false, 00:13:34.459 "supported_io_types": { 00:13:34.459 "read": true, 00:13:34.459 "write": true, 00:13:34.459 "unmap": true, 00:13:34.459 "flush": true, 00:13:34.459 "reset": true, 00:13:34.459 "nvme_admin": false, 00:13:34.459 "nvme_io": false, 00:13:34.459 "nvme_io_md": false, 00:13:34.459 "write_zeroes": true, 00:13:34.459 "zcopy": true, 00:13:34.459 "get_zone_info": false, 00:13:34.459 "zone_management": false, 00:13:34.459 "zone_append": false, 00:13:34.459 "compare": false, 00:13:34.459 "compare_and_write": false, 00:13:34.459 "abort": true, 00:13:34.459 "seek_hole": false, 00:13:34.459 "seek_data": false, 00:13:34.459 "copy": true, 00:13:34.459 "nvme_iov_md": false 
00:13:34.459 }, 00:13:34.459 "memory_domains": [ 00:13:34.459 { 00:13:34.459 "dma_device_id": "system", 00:13:34.459 "dma_device_type": 1 00:13:34.459 }, 00:13:34.459 { 00:13:34.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:34.460 "dma_device_type": 2 00:13:34.460 } 00:13:34.460 ], 00:13:34.460 "driver_specific": {} 00:13:34.460 } 00:13:34.460 ] 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:34.718 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:34.718 "name": "Existed_Raid", 00:13:34.718 "uuid": "bbf57dc8-1179-407d-8e93-78af86e1db06", 00:13:34.718 "strip_size_kb": 64, 00:13:34.718 "state": "online", 00:13:34.718 "raid_level": "raid0", 00:13:34.718 "superblock": true, 00:13:34.718 "num_base_bdevs": 3, 00:13:34.718 "num_base_bdevs_discovered": 3, 00:13:34.718 "num_base_bdevs_operational": 3, 00:13:34.718 "base_bdevs_list": [ 00:13:34.718 { 00:13:34.718 "name": "BaseBdev1", 00:13:34.718 "uuid": "5affc8c7-251c-46e7-9b26-2a35cb4d8fb1", 00:13:34.718 "is_configured": true, 00:13:34.718 "data_offset": 2048, 00:13:34.718 "data_size": 63488 00:13:34.718 }, 00:13:34.718 { 00:13:34.718 "name": "BaseBdev2", 00:13:34.718 "uuid": "a056f8b8-225d-4540-8fd4-b50cdeb2b2b4", 00:13:34.719 "is_configured": true, 00:13:34.719 "data_offset": 2048, 00:13:34.719 "data_size": 63488 00:13:34.719 }, 00:13:34.719 { 00:13:34.719 "name": "BaseBdev3", 00:13:34.719 "uuid": "a1faffd0-525b-4b6e-a68d-17ce1bde4c63", 00:13:34.719 "is_configured": true, 00:13:34.719 "data_offset": 2048, 00:13:34.719 "data_size": 63488 00:13:34.719 } 00:13:34.719 ] 00:13:34.719 }' 00:13:34.719 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:34.719 13:39:23 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:35.655 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:35.655 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:35.655 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:35.655 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:35.655 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:35.655 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:35.655 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:35.655 13:39:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:35.655 [2024-07-12 13:39:24.110867] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:35.655 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:35.655 "name": "Existed_Raid", 00:13:35.655 "aliases": [ 00:13:35.655 "bbf57dc8-1179-407d-8e93-78af86e1db06" 00:13:35.655 ], 00:13:35.655 "product_name": "Raid Volume", 00:13:35.655 "block_size": 512, 00:13:35.655 "num_blocks": 190464, 00:13:35.655 "uuid": "bbf57dc8-1179-407d-8e93-78af86e1db06", 00:13:35.655 "assigned_rate_limits": { 00:13:35.655 "rw_ios_per_sec": 0, 00:13:35.655 "rw_mbytes_per_sec": 0, 00:13:35.655 "r_mbytes_per_sec": 0, 00:13:35.655 "w_mbytes_per_sec": 0 00:13:35.655 }, 00:13:35.655 "claimed": false, 00:13:35.655 "zoned": false, 00:13:35.655 "supported_io_types": { 00:13:35.655 "read": true, 00:13:35.655 "write": true, 00:13:35.655 "unmap": true, 00:13:35.655 "flush": true, 00:13:35.655 "reset": true, 00:13:35.655 "nvme_admin": false, 00:13:35.655 "nvme_io": false, 00:13:35.655 "nvme_io_md": false, 00:13:35.655 "write_zeroes": true, 00:13:35.655 "zcopy": false, 00:13:35.655 "get_zone_info": false, 00:13:35.655 "zone_management": false, 00:13:35.655 "zone_append": false, 00:13:35.655 "compare": false, 00:13:35.655 "compare_and_write": false, 00:13:35.655 "abort": false, 00:13:35.655 "seek_hole": false, 00:13:35.655 "seek_data": false, 00:13:35.655 "copy": false, 00:13:35.655 "nvme_iov_md": false 00:13:35.655 }, 00:13:35.655 "memory_domains": [ 00:13:35.655 { 00:13:35.655 "dma_device_id": "system", 00:13:35.655 "dma_device_type": 1 00:13:35.655 }, 00:13:35.655 { 00:13:35.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.655 "dma_device_type": 2 00:13:35.655 }, 00:13:35.655 { 00:13:35.655 "dma_device_id": "system", 00:13:35.655 "dma_device_type": 1 00:13:35.655 }, 00:13:35.655 { 00:13:35.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.655 "dma_device_type": 2 00:13:35.655 }, 00:13:35.655 { 00:13:35.655 "dma_device_id": "system", 00:13:35.655 "dma_device_type": 1 00:13:35.655 }, 00:13:35.655 { 00:13:35.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.655 "dma_device_type": 2 00:13:35.655 } 00:13:35.655 ], 00:13:35.655 "driver_specific": { 00:13:35.655 "raid": { 00:13:35.655 "uuid": "bbf57dc8-1179-407d-8e93-78af86e1db06", 00:13:35.655 "strip_size_kb": 64, 00:13:35.655 "state": "online", 00:13:35.655 "raid_level": "raid0", 00:13:35.655 "superblock": true, 00:13:35.655 
"num_base_bdevs": 3, 00:13:35.655 "num_base_bdevs_discovered": 3, 00:13:35.655 "num_base_bdevs_operational": 3, 00:13:35.655 "base_bdevs_list": [ 00:13:35.655 { 00:13:35.655 "name": "BaseBdev1", 00:13:35.655 "uuid": "5affc8c7-251c-46e7-9b26-2a35cb4d8fb1", 00:13:35.655 "is_configured": true, 00:13:35.655 "data_offset": 2048, 00:13:35.655 "data_size": 63488 00:13:35.655 }, 00:13:35.655 { 00:13:35.655 "name": "BaseBdev2", 00:13:35.655 "uuid": "a056f8b8-225d-4540-8fd4-b50cdeb2b2b4", 00:13:35.655 "is_configured": true, 00:13:35.655 "data_offset": 2048, 00:13:35.655 "data_size": 63488 00:13:35.655 }, 00:13:35.655 { 00:13:35.655 "name": "BaseBdev3", 00:13:35.655 "uuid": "a1faffd0-525b-4b6e-a68d-17ce1bde4c63", 00:13:35.655 "is_configured": true, 00:13:35.655 "data_offset": 2048, 00:13:35.655 "data_size": 63488 00:13:35.655 } 00:13:35.655 ] 00:13:35.655 } 00:13:35.655 } 00:13:35.655 }' 00:13:35.655 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:35.655 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:35.655 BaseBdev2 00:13:35.655 BaseBdev3' 00:13:35.655 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:35.655 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:35.655 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:35.914 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:35.914 "name": "BaseBdev1", 00:13:35.914 "aliases": [ 00:13:35.914 "5affc8c7-251c-46e7-9b26-2a35cb4d8fb1" 00:13:35.914 ], 00:13:35.914 "product_name": "Malloc disk", 00:13:35.914 "block_size": 512, 00:13:35.914 "num_blocks": 65536, 00:13:35.914 "uuid": "5affc8c7-251c-46e7-9b26-2a35cb4d8fb1", 00:13:35.914 "assigned_rate_limits": { 00:13:35.914 "rw_ios_per_sec": 0, 00:13:35.914 "rw_mbytes_per_sec": 0, 00:13:35.914 "r_mbytes_per_sec": 0, 00:13:35.914 "w_mbytes_per_sec": 0 00:13:35.914 }, 00:13:35.914 "claimed": true, 00:13:35.914 "claim_type": "exclusive_write", 00:13:35.914 "zoned": false, 00:13:35.914 "supported_io_types": { 00:13:35.914 "read": true, 00:13:35.914 "write": true, 00:13:35.914 "unmap": true, 00:13:35.914 "flush": true, 00:13:35.914 "reset": true, 00:13:35.914 "nvme_admin": false, 00:13:35.914 "nvme_io": false, 00:13:35.914 "nvme_io_md": false, 00:13:35.914 "write_zeroes": true, 00:13:35.914 "zcopy": true, 00:13:35.914 "get_zone_info": false, 00:13:35.914 "zone_management": false, 00:13:35.914 "zone_append": false, 00:13:35.914 "compare": false, 00:13:35.914 "compare_and_write": false, 00:13:35.914 "abort": true, 00:13:35.914 "seek_hole": false, 00:13:35.914 "seek_data": false, 00:13:35.914 "copy": true, 00:13:35.914 "nvme_iov_md": false 00:13:35.914 }, 00:13:35.914 "memory_domains": [ 00:13:35.914 { 00:13:35.914 "dma_device_id": "system", 00:13:35.914 "dma_device_type": 1 00:13:35.914 }, 00:13:35.914 { 00:13:35.914 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.914 "dma_device_type": 2 00:13:35.914 } 00:13:35.914 ], 00:13:35.914 "driver_specific": {} 00:13:35.914 }' 00:13:35.914 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:35.914 13:39:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.174 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:36.174 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.174 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.174 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:36.174 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.174 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.174 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:36.174 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.174 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.433 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:36.433 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:36.433 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:36.433 13:39:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:36.692 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:36.692 "name": "BaseBdev2", 00:13:36.692 "aliases": [ 00:13:36.692 "a056f8b8-225d-4540-8fd4-b50cdeb2b2b4" 00:13:36.692 ], 00:13:36.692 "product_name": "Malloc disk", 00:13:36.692 "block_size": 512, 00:13:36.692 "num_blocks": 65536, 00:13:36.692 "uuid": "a056f8b8-225d-4540-8fd4-b50cdeb2b2b4", 00:13:36.692 "assigned_rate_limits": { 00:13:36.692 "rw_ios_per_sec": 0, 00:13:36.692 "rw_mbytes_per_sec": 0, 00:13:36.692 "r_mbytes_per_sec": 0, 00:13:36.692 "w_mbytes_per_sec": 0 00:13:36.692 }, 00:13:36.692 "claimed": true, 00:13:36.692 "claim_type": "exclusive_write", 00:13:36.692 "zoned": false, 00:13:36.692 "supported_io_types": { 00:13:36.692 "read": true, 00:13:36.692 "write": true, 00:13:36.692 "unmap": true, 00:13:36.692 "flush": true, 00:13:36.692 "reset": true, 00:13:36.692 "nvme_admin": false, 00:13:36.692 "nvme_io": false, 00:13:36.692 "nvme_io_md": false, 00:13:36.692 "write_zeroes": true, 00:13:36.692 "zcopy": true, 00:13:36.692 "get_zone_info": false, 00:13:36.692 "zone_management": false, 00:13:36.692 "zone_append": false, 00:13:36.692 "compare": false, 00:13:36.692 "compare_and_write": false, 00:13:36.692 "abort": true, 00:13:36.692 "seek_hole": false, 00:13:36.692 "seek_data": false, 00:13:36.692 "copy": true, 00:13:36.692 "nvme_iov_md": false 00:13:36.692 }, 00:13:36.692 "memory_domains": [ 00:13:36.692 { 00:13:36.692 "dma_device_id": "system", 00:13:36.692 "dma_device_type": 1 00:13:36.692 }, 00:13:36.692 { 00:13:36.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.692 "dma_device_type": 2 00:13:36.692 } 00:13:36.692 ], 00:13:36.692 "driver_specific": {} 00:13:36.692 }' 00:13:36.692 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.692 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.692 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # 
[[ 512 == 512 ]] 00:13:36.692 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.692 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.692 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:36.692 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.692 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.951 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:36.951 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.951 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.951 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:36.951 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:36.951 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:36.951 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:37.211 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:37.211 "name": "BaseBdev3", 00:13:37.211 "aliases": [ 00:13:37.211 "a1faffd0-525b-4b6e-a68d-17ce1bde4c63" 00:13:37.211 ], 00:13:37.211 "product_name": "Malloc disk", 00:13:37.211 "block_size": 512, 00:13:37.211 "num_blocks": 65536, 00:13:37.211 "uuid": "a1faffd0-525b-4b6e-a68d-17ce1bde4c63", 00:13:37.211 "assigned_rate_limits": { 00:13:37.211 "rw_ios_per_sec": 0, 00:13:37.211 "rw_mbytes_per_sec": 0, 00:13:37.211 "r_mbytes_per_sec": 0, 00:13:37.211 "w_mbytes_per_sec": 0 00:13:37.211 }, 00:13:37.211 "claimed": true, 00:13:37.211 "claim_type": "exclusive_write", 00:13:37.211 "zoned": false, 00:13:37.211 "supported_io_types": { 00:13:37.211 "read": true, 00:13:37.211 "write": true, 00:13:37.211 "unmap": true, 00:13:37.211 "flush": true, 00:13:37.211 "reset": true, 00:13:37.211 "nvme_admin": false, 00:13:37.211 "nvme_io": false, 00:13:37.211 "nvme_io_md": false, 00:13:37.211 "write_zeroes": true, 00:13:37.211 "zcopy": true, 00:13:37.211 "get_zone_info": false, 00:13:37.211 "zone_management": false, 00:13:37.211 "zone_append": false, 00:13:37.211 "compare": false, 00:13:37.211 "compare_and_write": false, 00:13:37.211 "abort": true, 00:13:37.211 "seek_hole": false, 00:13:37.211 "seek_data": false, 00:13:37.211 "copy": true, 00:13:37.211 "nvme_iov_md": false 00:13:37.211 }, 00:13:37.211 "memory_domains": [ 00:13:37.211 { 00:13:37.211 "dma_device_id": "system", 00:13:37.211 "dma_device_type": 1 00:13:37.211 }, 00:13:37.211 { 00:13:37.211 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.211 "dma_device_type": 2 00:13:37.211 } 00:13:37.211 ], 00:13:37.211 "driver_specific": {} 00:13:37.211 }' 00:13:37.211 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.211 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.211 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:37.211 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.211 
13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.211 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:37.470 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.470 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.470 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:37.470 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.470 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.470 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:37.470 13:39:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:37.729 [2024-07-12 13:39:26.200213] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:37.729 [2024-07-12 13:39:26.200237] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:37.729 [2024-07-12 13:39:26.200276] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.729 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:37.988 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:13:37.988 "name": "Existed_Raid", 00:13:37.988 "uuid": "bbf57dc8-1179-407d-8e93-78af86e1db06", 00:13:37.988 "strip_size_kb": 64, 00:13:37.988 "state": "offline", 00:13:37.988 "raid_level": "raid0", 00:13:37.988 "superblock": true, 00:13:37.988 "num_base_bdevs": 3, 00:13:37.988 "num_base_bdevs_discovered": 2, 00:13:37.988 "num_base_bdevs_operational": 2, 00:13:37.988 "base_bdevs_list": [ 00:13:37.988 { 00:13:37.988 "name": null, 00:13:37.988 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:37.988 "is_configured": false, 00:13:37.988 "data_offset": 2048, 00:13:37.988 "data_size": 63488 00:13:37.988 }, 00:13:37.988 { 00:13:37.988 "name": "BaseBdev2", 00:13:37.988 "uuid": "a056f8b8-225d-4540-8fd4-b50cdeb2b2b4", 00:13:37.988 "is_configured": true, 00:13:37.988 "data_offset": 2048, 00:13:37.988 "data_size": 63488 00:13:37.988 }, 00:13:37.988 { 00:13:37.988 "name": "BaseBdev3", 00:13:37.988 "uuid": "a1faffd0-525b-4b6e-a68d-17ce1bde4c63", 00:13:37.988 "is_configured": true, 00:13:37.988 "data_offset": 2048, 00:13:37.988 "data_size": 63488 00:13:37.988 } 00:13:37.988 ] 00:13:37.988 }' 00:13:37.988 13:39:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:37.988 13:39:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:38.555 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:38.555 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:38.555 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:38.555 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.813 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:38.813 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:38.813 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:39.071 [2024-07-12 13:39:27.548777] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:39.071 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:39.071 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:39.071 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.071 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:39.329 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:39.329 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:39.329 13:39:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:39.587 [2024-07-12 13:39:28.054534] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:39.587 [2024-07-12 13:39:28.054575] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e21b10 name Existed_Raid, state offline 00:13:39.587 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:39.587 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:39.587 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.587 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:39.845 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:39.845 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:39.845 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:39.845 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:39.845 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:39.845 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:40.104 BaseBdev2 00:13:40.104 13:39:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:40.104 13:39:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:40.104 13:39:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:40.104 13:39:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:40.104 13:39:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:40.104 13:39:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:40.104 13:39:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:40.362 13:39:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:40.621 [ 00:13:40.621 { 00:13:40.621 "name": "BaseBdev2", 00:13:40.621 "aliases": [ 00:13:40.621 "1e89daef-5c1b-49de-b65d-5c2f0281766c" 00:13:40.621 ], 00:13:40.621 "product_name": "Malloc disk", 00:13:40.621 "block_size": 512, 00:13:40.621 "num_blocks": 65536, 00:13:40.621 "uuid": "1e89daef-5c1b-49de-b65d-5c2f0281766c", 00:13:40.621 "assigned_rate_limits": { 00:13:40.621 "rw_ios_per_sec": 0, 00:13:40.621 "rw_mbytes_per_sec": 0, 00:13:40.621 "r_mbytes_per_sec": 0, 00:13:40.621 "w_mbytes_per_sec": 0 00:13:40.621 }, 00:13:40.621 "claimed": false, 00:13:40.621 "zoned": false, 00:13:40.621 "supported_io_types": { 00:13:40.621 "read": true, 00:13:40.621 "write": true, 00:13:40.621 "unmap": true, 00:13:40.621 "flush": true, 00:13:40.621 "reset": true, 00:13:40.621 "nvme_admin": false, 00:13:40.621 "nvme_io": false, 00:13:40.621 "nvme_io_md": false, 00:13:40.621 "write_zeroes": true, 00:13:40.621 "zcopy": true, 00:13:40.621 "get_zone_info": false, 00:13:40.621 "zone_management": false, 00:13:40.621 
"zone_append": false, 00:13:40.621 "compare": false, 00:13:40.621 "compare_and_write": false, 00:13:40.621 "abort": true, 00:13:40.621 "seek_hole": false, 00:13:40.621 "seek_data": false, 00:13:40.622 "copy": true, 00:13:40.622 "nvme_iov_md": false 00:13:40.622 }, 00:13:40.622 "memory_domains": [ 00:13:40.622 { 00:13:40.622 "dma_device_id": "system", 00:13:40.622 "dma_device_type": 1 00:13:40.622 }, 00:13:40.622 { 00:13:40.622 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.622 "dma_device_type": 2 00:13:40.622 } 00:13:40.622 ], 00:13:40.622 "driver_specific": {} 00:13:40.622 } 00:13:40.622 ] 00:13:40.622 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:40.622 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:40.622 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:40.622 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:40.879 BaseBdev3 00:13:40.879 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:40.879 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:40.879 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:40.879 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:40.879 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:40.879 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:40.880 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:41.138 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:41.396 [ 00:13:41.396 { 00:13:41.396 "name": "BaseBdev3", 00:13:41.396 "aliases": [ 00:13:41.396 "10882be9-e1a7-4a75-9036-f43dfc686b6f" 00:13:41.396 ], 00:13:41.396 "product_name": "Malloc disk", 00:13:41.396 "block_size": 512, 00:13:41.396 "num_blocks": 65536, 00:13:41.396 "uuid": "10882be9-e1a7-4a75-9036-f43dfc686b6f", 00:13:41.396 "assigned_rate_limits": { 00:13:41.396 "rw_ios_per_sec": 0, 00:13:41.397 "rw_mbytes_per_sec": 0, 00:13:41.397 "r_mbytes_per_sec": 0, 00:13:41.397 "w_mbytes_per_sec": 0 00:13:41.397 }, 00:13:41.397 "claimed": false, 00:13:41.397 "zoned": false, 00:13:41.397 "supported_io_types": { 00:13:41.397 "read": true, 00:13:41.397 "write": true, 00:13:41.397 "unmap": true, 00:13:41.397 "flush": true, 00:13:41.397 "reset": true, 00:13:41.397 "nvme_admin": false, 00:13:41.397 "nvme_io": false, 00:13:41.397 "nvme_io_md": false, 00:13:41.397 "write_zeroes": true, 00:13:41.397 "zcopy": true, 00:13:41.397 "get_zone_info": false, 00:13:41.397 "zone_management": false, 00:13:41.397 "zone_append": false, 00:13:41.397 "compare": false, 00:13:41.397 "compare_and_write": false, 00:13:41.397 "abort": true, 00:13:41.397 "seek_hole": false, 00:13:41.397 "seek_data": false, 00:13:41.397 "copy": true, 00:13:41.397 "nvme_iov_md": false 
00:13:41.397 }, 00:13:41.397 "memory_domains": [ 00:13:41.397 { 00:13:41.397 "dma_device_id": "system", 00:13:41.397 "dma_device_type": 1 00:13:41.397 }, 00:13:41.397 { 00:13:41.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.397 "dma_device_type": 2 00:13:41.397 } 00:13:41.397 ], 00:13:41.397 "driver_specific": {} 00:13:41.397 } 00:13:41.397 ] 00:13:41.397 13:39:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:41.397 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:41.397 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:41.397 13:39:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:41.656 [2024-07-12 13:39:30.032183] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:41.656 [2024-07-12 13:39:30.032227] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:41.656 [2024-07-12 13:39:30.032248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:41.656 [2024-07-12 13:39:30.033627] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.656 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:41.914 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:41.914 "name": "Existed_Raid", 00:13:41.914 "uuid": "59e16bc6-a490-4b7a-9964-dd6397ce5985", 00:13:41.914 "strip_size_kb": 64, 00:13:41.914 "state": "configuring", 00:13:41.914 "raid_level": "raid0", 00:13:41.914 "superblock": true, 00:13:41.914 "num_base_bdevs": 3, 00:13:41.914 "num_base_bdevs_discovered": 2, 00:13:41.914 "num_base_bdevs_operational": 3, 00:13:41.914 "base_bdevs_list": [ 00:13:41.914 { 00:13:41.914 "name": "BaseBdev1", 00:13:41.914 
"uuid": "00000000-0000-0000-0000-000000000000", 00:13:41.914 "is_configured": false, 00:13:41.914 "data_offset": 0, 00:13:41.914 "data_size": 0 00:13:41.914 }, 00:13:41.914 { 00:13:41.914 "name": "BaseBdev2", 00:13:41.914 "uuid": "1e89daef-5c1b-49de-b65d-5c2f0281766c", 00:13:41.914 "is_configured": true, 00:13:41.914 "data_offset": 2048, 00:13:41.914 "data_size": 63488 00:13:41.914 }, 00:13:41.914 { 00:13:41.914 "name": "BaseBdev3", 00:13:41.915 "uuid": "10882be9-e1a7-4a75-9036-f43dfc686b6f", 00:13:41.915 "is_configured": true, 00:13:41.915 "data_offset": 2048, 00:13:41.915 "data_size": 63488 00:13:41.915 } 00:13:41.915 ] 00:13:41.915 }' 00:13:41.915 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:41.915 13:39:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:42.482 13:39:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:42.482 [2024-07-12 13:39:31.050839] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.741 "name": "Existed_Raid", 00:13:42.741 "uuid": "59e16bc6-a490-4b7a-9964-dd6397ce5985", 00:13:42.741 "strip_size_kb": 64, 00:13:42.741 "state": "configuring", 00:13:42.741 "raid_level": "raid0", 00:13:42.741 "superblock": true, 00:13:42.741 "num_base_bdevs": 3, 00:13:42.741 "num_base_bdevs_discovered": 1, 00:13:42.741 "num_base_bdevs_operational": 3, 00:13:42.741 "base_bdevs_list": [ 00:13:42.741 { 00:13:42.741 "name": "BaseBdev1", 00:13:42.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:42.741 "is_configured": false, 00:13:42.741 "data_offset": 0, 00:13:42.741 "data_size": 0 00:13:42.741 }, 00:13:42.741 { 00:13:42.741 "name": null, 00:13:42.741 "uuid": "1e89daef-5c1b-49de-b65d-5c2f0281766c", 00:13:42.741 
"is_configured": false, 00:13:42.741 "data_offset": 2048, 00:13:42.741 "data_size": 63488 00:13:42.741 }, 00:13:42.741 { 00:13:42.741 "name": "BaseBdev3", 00:13:42.741 "uuid": "10882be9-e1a7-4a75-9036-f43dfc686b6f", 00:13:42.741 "is_configured": true, 00:13:42.741 "data_offset": 2048, 00:13:42.741 "data_size": 63488 00:13:42.741 } 00:13:42.741 ] 00:13:42.741 }' 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.741 13:39:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:43.310 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:43.310 13:39:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.569 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:43.569 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:43.828 [2024-07-12 13:39:32.241555] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:43.828 BaseBdev1 00:13:43.828 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:43.828 13:39:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:43.828 13:39:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:43.828 13:39:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:43.828 13:39:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:43.828 13:39:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:43.828 13:39:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:44.088 13:39:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:44.347 [ 00:13:44.347 { 00:13:44.347 "name": "BaseBdev1", 00:13:44.347 "aliases": [ 00:13:44.347 "727777cf-0e0d-4817-9481-32943063c204" 00:13:44.347 ], 00:13:44.347 "product_name": "Malloc disk", 00:13:44.347 "block_size": 512, 00:13:44.347 "num_blocks": 65536, 00:13:44.347 "uuid": "727777cf-0e0d-4817-9481-32943063c204", 00:13:44.347 "assigned_rate_limits": { 00:13:44.347 "rw_ios_per_sec": 0, 00:13:44.347 "rw_mbytes_per_sec": 0, 00:13:44.347 "r_mbytes_per_sec": 0, 00:13:44.347 "w_mbytes_per_sec": 0 00:13:44.347 }, 00:13:44.347 "claimed": true, 00:13:44.347 "claim_type": "exclusive_write", 00:13:44.347 "zoned": false, 00:13:44.347 "supported_io_types": { 00:13:44.347 "read": true, 00:13:44.347 "write": true, 00:13:44.347 "unmap": true, 00:13:44.347 "flush": true, 00:13:44.347 "reset": true, 00:13:44.347 "nvme_admin": false, 00:13:44.347 "nvme_io": false, 00:13:44.347 "nvme_io_md": false, 00:13:44.347 "write_zeroes": true, 00:13:44.347 "zcopy": true, 00:13:44.347 "get_zone_info": false, 00:13:44.347 "zone_management": 
false, 00:13:44.347 "zone_append": false, 00:13:44.347 "compare": false, 00:13:44.347 "compare_and_write": false, 00:13:44.347 "abort": true, 00:13:44.347 "seek_hole": false, 00:13:44.347 "seek_data": false, 00:13:44.347 "copy": true, 00:13:44.347 "nvme_iov_md": false 00:13:44.347 }, 00:13:44.347 "memory_domains": [ 00:13:44.347 { 00:13:44.347 "dma_device_id": "system", 00:13:44.347 "dma_device_type": 1 00:13:44.347 }, 00:13:44.347 { 00:13:44.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:44.347 "dma_device_type": 2 00:13:44.347 } 00:13:44.347 ], 00:13:44.347 "driver_specific": {} 00:13:44.347 } 00:13:44.347 ] 00:13:44.347 13:39:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:44.347 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:44.347 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:44.347 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:44.347 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:44.348 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:44.348 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:44.348 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:44.348 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:44.348 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:44.348 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:44.348 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.348 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:44.605 13:39:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:44.605 "name": "Existed_Raid", 00:13:44.605 "uuid": "59e16bc6-a490-4b7a-9964-dd6397ce5985", 00:13:44.605 "strip_size_kb": 64, 00:13:44.605 "state": "configuring", 00:13:44.605 "raid_level": "raid0", 00:13:44.605 "superblock": true, 00:13:44.605 "num_base_bdevs": 3, 00:13:44.605 "num_base_bdevs_discovered": 2, 00:13:44.605 "num_base_bdevs_operational": 3, 00:13:44.605 "base_bdevs_list": [ 00:13:44.605 { 00:13:44.605 "name": "BaseBdev1", 00:13:44.605 "uuid": "727777cf-0e0d-4817-9481-32943063c204", 00:13:44.605 "is_configured": true, 00:13:44.605 "data_offset": 2048, 00:13:44.605 "data_size": 63488 00:13:44.605 }, 00:13:44.605 { 00:13:44.605 "name": null, 00:13:44.605 "uuid": "1e89daef-5c1b-49de-b65d-5c2f0281766c", 00:13:44.605 "is_configured": false, 00:13:44.605 "data_offset": 2048, 00:13:44.605 "data_size": 63488 00:13:44.605 }, 00:13:44.605 { 00:13:44.605 "name": "BaseBdev3", 00:13:44.605 "uuid": "10882be9-e1a7-4a75-9036-f43dfc686b6f", 00:13:44.605 "is_configured": true, 00:13:44.605 "data_offset": 2048, 00:13:44.605 "data_size": 63488 00:13:44.605 } 00:13:44.605 ] 00:13:44.605 }' 00:13:44.605 13:39:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:44.605 13:39:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:45.172 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.172 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:45.431 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:45.431 13:39:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:45.691 [2024-07-12 13:39:34.054376] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.691 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.951 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.951 "name": "Existed_Raid", 00:13:45.951 "uuid": "59e16bc6-a490-4b7a-9964-dd6397ce5985", 00:13:45.951 "strip_size_kb": 64, 00:13:45.951 "state": "configuring", 00:13:45.951 "raid_level": "raid0", 00:13:45.951 "superblock": true, 00:13:45.951 "num_base_bdevs": 3, 00:13:45.951 "num_base_bdevs_discovered": 1, 00:13:45.951 "num_base_bdevs_operational": 3, 00:13:45.951 "base_bdevs_list": [ 00:13:45.951 { 00:13:45.951 "name": "BaseBdev1", 00:13:45.951 "uuid": "727777cf-0e0d-4817-9481-32943063c204", 00:13:45.951 "is_configured": true, 00:13:45.951 "data_offset": 2048, 00:13:45.951 "data_size": 63488 00:13:45.951 }, 00:13:45.951 { 00:13:45.951 "name": null, 00:13:45.951 "uuid": "1e89daef-5c1b-49de-b65d-5c2f0281766c", 00:13:45.951 "is_configured": false, 00:13:45.951 "data_offset": 2048, 00:13:45.951 "data_size": 63488 00:13:45.951 }, 00:13:45.951 { 00:13:45.951 "name": null, 00:13:45.951 "uuid": "10882be9-e1a7-4a75-9036-f43dfc686b6f", 00:13:45.951 "is_configured": false, 
00:13:45.951 "data_offset": 2048, 00:13:45.951 "data_size": 63488 00:13:45.951 } 00:13:45.951 ] 00:13:45.951 }' 00:13:45.951 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.951 13:39:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:46.520 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.520 13:39:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:46.520 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:46.520 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:46.779 [2024-07-12 13:39:35.285655] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.779 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.038 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.038 "name": "Existed_Raid", 00:13:47.038 "uuid": "59e16bc6-a490-4b7a-9964-dd6397ce5985", 00:13:47.038 "strip_size_kb": 64, 00:13:47.038 "state": "configuring", 00:13:47.038 "raid_level": "raid0", 00:13:47.038 "superblock": true, 00:13:47.038 "num_base_bdevs": 3, 00:13:47.038 "num_base_bdevs_discovered": 2, 00:13:47.038 "num_base_bdevs_operational": 3, 00:13:47.038 "base_bdevs_list": [ 00:13:47.038 { 00:13:47.038 "name": "BaseBdev1", 00:13:47.038 "uuid": "727777cf-0e0d-4817-9481-32943063c204", 00:13:47.038 "is_configured": true, 00:13:47.038 "data_offset": 2048, 00:13:47.038 "data_size": 63488 00:13:47.038 }, 00:13:47.038 { 00:13:47.038 "name": null, 00:13:47.038 "uuid": "1e89daef-5c1b-49de-b65d-5c2f0281766c", 00:13:47.038 "is_configured": false, 00:13:47.038 "data_offset": 
2048, 00:13:47.038 "data_size": 63488 00:13:47.038 }, 00:13:47.038 { 00:13:47.038 "name": "BaseBdev3", 00:13:47.038 "uuid": "10882be9-e1a7-4a75-9036-f43dfc686b6f", 00:13:47.038 "is_configured": true, 00:13:47.038 "data_offset": 2048, 00:13:47.038 "data_size": 63488 00:13:47.038 } 00:13:47.038 ] 00:13:47.038 }' 00:13:47.038 13:39:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.038 13:39:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:47.606 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.606 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:47.865 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:47.865 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:48.125 [2024-07-12 13:39:36.645287] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.125 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:48.384 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.384 "name": "Existed_Raid", 00:13:48.384 "uuid": "59e16bc6-a490-4b7a-9964-dd6397ce5985", 00:13:48.384 "strip_size_kb": 64, 00:13:48.384 "state": "configuring", 00:13:48.384 "raid_level": "raid0", 00:13:48.384 "superblock": true, 00:13:48.384 "num_base_bdevs": 3, 00:13:48.384 "num_base_bdevs_discovered": 1, 00:13:48.384 "num_base_bdevs_operational": 3, 00:13:48.384 "base_bdevs_list": [ 00:13:48.384 { 00:13:48.384 "name": null, 00:13:48.384 "uuid": "727777cf-0e0d-4817-9481-32943063c204", 00:13:48.384 "is_configured": false, 00:13:48.384 "data_offset": 2048, 00:13:48.384 "data_size": 63488 00:13:48.384 }, 00:13:48.384 
{ 00:13:48.384 "name": null, 00:13:48.384 "uuid": "1e89daef-5c1b-49de-b65d-5c2f0281766c", 00:13:48.384 "is_configured": false, 00:13:48.384 "data_offset": 2048, 00:13:48.384 "data_size": 63488 00:13:48.384 }, 00:13:48.384 { 00:13:48.384 "name": "BaseBdev3", 00:13:48.384 "uuid": "10882be9-e1a7-4a75-9036-f43dfc686b6f", 00:13:48.384 "is_configured": true, 00:13:48.384 "data_offset": 2048, 00:13:48.384 "data_size": 63488 00:13:48.384 } 00:13:48.384 ] 00:13:48.384 }' 00:13:48.384 13:39:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.384 13:39:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:48.951 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.951 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:49.210 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:49.210 13:39:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:49.468 [2024-07-12 13:39:37.989238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.468 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:49.726 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.726 "name": "Existed_Raid", 00:13:49.726 "uuid": "59e16bc6-a490-4b7a-9964-dd6397ce5985", 00:13:49.726 "strip_size_kb": 64, 00:13:49.726 "state": "configuring", 00:13:49.726 "raid_level": "raid0", 00:13:49.726 "superblock": true, 00:13:49.726 "num_base_bdevs": 3, 00:13:49.726 "num_base_bdevs_discovered": 2, 00:13:49.726 "num_base_bdevs_operational": 3, 00:13:49.726 "base_bdevs_list": [ 00:13:49.726 { 00:13:49.726 "name": 
null, 00:13:49.726 "uuid": "727777cf-0e0d-4817-9481-32943063c204", 00:13:49.726 "is_configured": false, 00:13:49.726 "data_offset": 2048, 00:13:49.726 "data_size": 63488 00:13:49.726 }, 00:13:49.726 { 00:13:49.726 "name": "BaseBdev2", 00:13:49.726 "uuid": "1e89daef-5c1b-49de-b65d-5c2f0281766c", 00:13:49.726 "is_configured": true, 00:13:49.726 "data_offset": 2048, 00:13:49.726 "data_size": 63488 00:13:49.726 }, 00:13:49.726 { 00:13:49.726 "name": "BaseBdev3", 00:13:49.726 "uuid": "10882be9-e1a7-4a75-9036-f43dfc686b6f", 00:13:49.726 "is_configured": true, 00:13:49.726 "data_offset": 2048, 00:13:49.726 "data_size": 63488 00:13:49.726 } 00:13:49.726 ] 00:13:49.726 }' 00:13:49.726 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.726 13:39:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:50.292 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.292 13:39:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:50.550 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:50.550 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:50.550 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.809 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 727777cf-0e0d-4817-9481-32943063c204 00:13:51.067 [2024-07-12 13:39:39.585944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:51.067 [2024-07-12 13:39:39.586095] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e1fe90 00:13:51.067 [2024-07-12 13:39:39.586110] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:51.067 [2024-07-12 13:39:39.586287] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e20700 00:13:51.067 [2024-07-12 13:39:39.586399] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e1fe90 00:13:51.067 [2024-07-12 13:39:39.586409] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e1fe90 00:13:51.067 [2024-07-12 13:39:39.586500] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:51.067 NewBaseBdev 00:13:51.067 13:39:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:51.067 13:39:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:51.067 13:39:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:51.067 13:39:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:51.067 13:39:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:51.067 13:39:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:51.067 13:39:39 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:51.326 13:39:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:51.586 [ 00:13:51.586 { 00:13:51.586 "name": "NewBaseBdev", 00:13:51.586 "aliases": [ 00:13:51.586 "727777cf-0e0d-4817-9481-32943063c204" 00:13:51.586 ], 00:13:51.586 "product_name": "Malloc disk", 00:13:51.586 "block_size": 512, 00:13:51.586 "num_blocks": 65536, 00:13:51.586 "uuid": "727777cf-0e0d-4817-9481-32943063c204", 00:13:51.586 "assigned_rate_limits": { 00:13:51.586 "rw_ios_per_sec": 0, 00:13:51.586 "rw_mbytes_per_sec": 0, 00:13:51.586 "r_mbytes_per_sec": 0, 00:13:51.586 "w_mbytes_per_sec": 0 00:13:51.586 }, 00:13:51.586 "claimed": true, 00:13:51.586 "claim_type": "exclusive_write", 00:13:51.586 "zoned": false, 00:13:51.586 "supported_io_types": { 00:13:51.586 "read": true, 00:13:51.586 "write": true, 00:13:51.586 "unmap": true, 00:13:51.586 "flush": true, 00:13:51.586 "reset": true, 00:13:51.586 "nvme_admin": false, 00:13:51.586 "nvme_io": false, 00:13:51.586 "nvme_io_md": false, 00:13:51.586 "write_zeroes": true, 00:13:51.586 "zcopy": true, 00:13:51.586 "get_zone_info": false, 00:13:51.586 "zone_management": false, 00:13:51.586 "zone_append": false, 00:13:51.586 "compare": false, 00:13:51.586 "compare_and_write": false, 00:13:51.586 "abort": true, 00:13:51.586 "seek_hole": false, 00:13:51.586 "seek_data": false, 00:13:51.586 "copy": true, 00:13:51.586 "nvme_iov_md": false 00:13:51.586 }, 00:13:51.586 "memory_domains": [ 00:13:51.586 { 00:13:51.586 "dma_device_id": "system", 00:13:51.586 "dma_device_type": 1 00:13:51.586 }, 00:13:51.586 { 00:13:51.586 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.586 "dma_device_type": 2 00:13:51.586 } 00:13:51.586 ], 00:13:51.586 "driver_specific": {} 00:13:51.586 } 00:13:51.586 ] 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:13:51.586 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:51.845 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:51.845 "name": "Existed_Raid", 00:13:51.845 "uuid": "59e16bc6-a490-4b7a-9964-dd6397ce5985", 00:13:51.845 "strip_size_kb": 64, 00:13:51.845 "state": "online", 00:13:51.845 "raid_level": "raid0", 00:13:51.845 "superblock": true, 00:13:51.845 "num_base_bdevs": 3, 00:13:51.845 "num_base_bdevs_discovered": 3, 00:13:51.845 "num_base_bdevs_operational": 3, 00:13:51.845 "base_bdevs_list": [ 00:13:51.845 { 00:13:51.845 "name": "NewBaseBdev", 00:13:51.845 "uuid": "727777cf-0e0d-4817-9481-32943063c204", 00:13:51.845 "is_configured": true, 00:13:51.845 "data_offset": 2048, 00:13:51.845 "data_size": 63488 00:13:51.845 }, 00:13:51.845 { 00:13:51.845 "name": "BaseBdev2", 00:13:51.845 "uuid": "1e89daef-5c1b-49de-b65d-5c2f0281766c", 00:13:51.845 "is_configured": true, 00:13:51.845 "data_offset": 2048, 00:13:51.845 "data_size": 63488 00:13:51.845 }, 00:13:51.845 { 00:13:51.845 "name": "BaseBdev3", 00:13:51.845 "uuid": "10882be9-e1a7-4a75-9036-f43dfc686b6f", 00:13:51.845 "is_configured": true, 00:13:51.845 "data_offset": 2048, 00:13:51.845 "data_size": 63488 00:13:51.845 } 00:13:51.845 ] 00:13:51.845 }' 00:13:51.845 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:51.845 13:39:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:52.411 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:52.411 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:52.411 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:52.411 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:52.411 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:52.411 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:52.411 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:52.411 13:39:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:52.670 [2024-07-12 13:39:41.098249] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:52.670 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:52.670 "name": "Existed_Raid", 00:13:52.670 "aliases": [ 00:13:52.670 "59e16bc6-a490-4b7a-9964-dd6397ce5985" 00:13:52.670 ], 00:13:52.670 "product_name": "Raid Volume", 00:13:52.670 "block_size": 512, 00:13:52.670 "num_blocks": 190464, 00:13:52.670 "uuid": "59e16bc6-a490-4b7a-9964-dd6397ce5985", 00:13:52.670 "assigned_rate_limits": { 00:13:52.670 "rw_ios_per_sec": 0, 00:13:52.670 "rw_mbytes_per_sec": 0, 00:13:52.670 "r_mbytes_per_sec": 0, 00:13:52.670 "w_mbytes_per_sec": 0 00:13:52.670 }, 00:13:52.670 "claimed": false, 00:13:52.670 "zoned": false, 00:13:52.670 "supported_io_types": { 00:13:52.670 "read": true, 00:13:52.670 "write": true, 00:13:52.670 "unmap": true, 00:13:52.670 "flush": true, 00:13:52.670 "reset": true, 
00:13:52.670 "nvme_admin": false, 00:13:52.670 "nvme_io": false, 00:13:52.670 "nvme_io_md": false, 00:13:52.670 "write_zeroes": true, 00:13:52.670 "zcopy": false, 00:13:52.670 "get_zone_info": false, 00:13:52.670 "zone_management": false, 00:13:52.670 "zone_append": false, 00:13:52.670 "compare": false, 00:13:52.670 "compare_and_write": false, 00:13:52.670 "abort": false, 00:13:52.670 "seek_hole": false, 00:13:52.670 "seek_data": false, 00:13:52.670 "copy": false, 00:13:52.670 "nvme_iov_md": false 00:13:52.670 }, 00:13:52.670 "memory_domains": [ 00:13:52.670 { 00:13:52.670 "dma_device_id": "system", 00:13:52.670 "dma_device_type": 1 00:13:52.670 }, 00:13:52.670 { 00:13:52.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.670 "dma_device_type": 2 00:13:52.670 }, 00:13:52.670 { 00:13:52.670 "dma_device_id": "system", 00:13:52.670 "dma_device_type": 1 00:13:52.670 }, 00:13:52.670 { 00:13:52.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.670 "dma_device_type": 2 00:13:52.670 }, 00:13:52.670 { 00:13:52.670 "dma_device_id": "system", 00:13:52.670 "dma_device_type": 1 00:13:52.670 }, 00:13:52.670 { 00:13:52.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.670 "dma_device_type": 2 00:13:52.670 } 00:13:52.670 ], 00:13:52.670 "driver_specific": { 00:13:52.670 "raid": { 00:13:52.670 "uuid": "59e16bc6-a490-4b7a-9964-dd6397ce5985", 00:13:52.670 "strip_size_kb": 64, 00:13:52.670 "state": "online", 00:13:52.670 "raid_level": "raid0", 00:13:52.670 "superblock": true, 00:13:52.670 "num_base_bdevs": 3, 00:13:52.670 "num_base_bdevs_discovered": 3, 00:13:52.670 "num_base_bdevs_operational": 3, 00:13:52.670 "base_bdevs_list": [ 00:13:52.670 { 00:13:52.670 "name": "NewBaseBdev", 00:13:52.670 "uuid": "727777cf-0e0d-4817-9481-32943063c204", 00:13:52.670 "is_configured": true, 00:13:52.670 "data_offset": 2048, 00:13:52.670 "data_size": 63488 00:13:52.670 }, 00:13:52.670 { 00:13:52.670 "name": "BaseBdev2", 00:13:52.670 "uuid": "1e89daef-5c1b-49de-b65d-5c2f0281766c", 00:13:52.670 "is_configured": true, 00:13:52.670 "data_offset": 2048, 00:13:52.670 "data_size": 63488 00:13:52.670 }, 00:13:52.670 { 00:13:52.670 "name": "BaseBdev3", 00:13:52.670 "uuid": "10882be9-e1a7-4a75-9036-f43dfc686b6f", 00:13:52.670 "is_configured": true, 00:13:52.670 "data_offset": 2048, 00:13:52.670 "data_size": 63488 00:13:52.670 } 00:13:52.670 ] 00:13:52.670 } 00:13:52.670 } 00:13:52.670 }' 00:13:52.670 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:52.670 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:52.670 BaseBdev2 00:13:52.670 BaseBdev3' 00:13:52.670 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:52.670 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:52.670 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:53.237 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:53.237 "name": "NewBaseBdev", 00:13:53.237 "aliases": [ 00:13:53.237 "727777cf-0e0d-4817-9481-32943063c204" 00:13:53.237 ], 00:13:53.237 "product_name": "Malloc disk", 00:13:53.237 "block_size": 512, 00:13:53.237 "num_blocks": 65536, 00:13:53.237 "uuid": 
"727777cf-0e0d-4817-9481-32943063c204", 00:13:53.237 "assigned_rate_limits": { 00:13:53.237 "rw_ios_per_sec": 0, 00:13:53.237 "rw_mbytes_per_sec": 0, 00:13:53.237 "r_mbytes_per_sec": 0, 00:13:53.237 "w_mbytes_per_sec": 0 00:13:53.237 }, 00:13:53.237 "claimed": true, 00:13:53.237 "claim_type": "exclusive_write", 00:13:53.237 "zoned": false, 00:13:53.237 "supported_io_types": { 00:13:53.237 "read": true, 00:13:53.237 "write": true, 00:13:53.237 "unmap": true, 00:13:53.237 "flush": true, 00:13:53.237 "reset": true, 00:13:53.237 "nvme_admin": false, 00:13:53.237 "nvme_io": false, 00:13:53.237 "nvme_io_md": false, 00:13:53.237 "write_zeroes": true, 00:13:53.237 "zcopy": true, 00:13:53.237 "get_zone_info": false, 00:13:53.237 "zone_management": false, 00:13:53.237 "zone_append": false, 00:13:53.237 "compare": false, 00:13:53.237 "compare_and_write": false, 00:13:53.237 "abort": true, 00:13:53.237 "seek_hole": false, 00:13:53.237 "seek_data": false, 00:13:53.237 "copy": true, 00:13:53.237 "nvme_iov_md": false 00:13:53.237 }, 00:13:53.237 "memory_domains": [ 00:13:53.237 { 00:13:53.237 "dma_device_id": "system", 00:13:53.237 "dma_device_type": 1 00:13:53.237 }, 00:13:53.237 { 00:13:53.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.237 "dma_device_type": 2 00:13:53.237 } 00:13:53.237 ], 00:13:53.237 "driver_specific": {} 00:13:53.237 }' 00:13:53.237 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:53.237 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:53.237 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:53.237 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.237 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.494 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:53.494 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.494 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.494 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:53.494 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.494 13:39:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.494 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:53.494 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:53.494 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:53.494 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:53.752 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:53.753 "name": "BaseBdev2", 00:13:53.753 "aliases": [ 00:13:53.753 "1e89daef-5c1b-49de-b65d-5c2f0281766c" 00:13:53.753 ], 00:13:53.753 "product_name": "Malloc disk", 00:13:53.753 "block_size": 512, 00:13:53.753 "num_blocks": 65536, 00:13:53.753 "uuid": "1e89daef-5c1b-49de-b65d-5c2f0281766c", 00:13:53.753 "assigned_rate_limits": { 00:13:53.753 "rw_ios_per_sec": 0, 00:13:53.753 
"rw_mbytes_per_sec": 0, 00:13:53.753 "r_mbytes_per_sec": 0, 00:13:53.753 "w_mbytes_per_sec": 0 00:13:53.753 }, 00:13:53.753 "claimed": true, 00:13:53.753 "claim_type": "exclusive_write", 00:13:53.753 "zoned": false, 00:13:53.753 "supported_io_types": { 00:13:53.753 "read": true, 00:13:53.753 "write": true, 00:13:53.753 "unmap": true, 00:13:53.753 "flush": true, 00:13:53.753 "reset": true, 00:13:53.753 "nvme_admin": false, 00:13:53.753 "nvme_io": false, 00:13:53.753 "nvme_io_md": false, 00:13:53.753 "write_zeroes": true, 00:13:53.753 "zcopy": true, 00:13:53.753 "get_zone_info": false, 00:13:53.753 "zone_management": false, 00:13:53.753 "zone_append": false, 00:13:53.753 "compare": false, 00:13:53.753 "compare_and_write": false, 00:13:53.753 "abort": true, 00:13:53.753 "seek_hole": false, 00:13:53.753 "seek_data": false, 00:13:53.753 "copy": true, 00:13:53.753 "nvme_iov_md": false 00:13:53.753 }, 00:13:53.753 "memory_domains": [ 00:13:53.753 { 00:13:53.753 "dma_device_id": "system", 00:13:53.753 "dma_device_type": 1 00:13:53.753 }, 00:13:53.753 { 00:13:53.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.753 "dma_device_type": 2 00:13:53.753 } 00:13:53.753 ], 00:13:53.753 "driver_specific": {} 00:13:53.753 }' 00:13:53.753 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.012 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.012 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:54.012 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.012 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.012 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:54.012 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.012 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.270 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:54.270 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.270 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.270 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:54.270 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:54.270 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:54.270 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:54.529 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:54.529 "name": "BaseBdev3", 00:13:54.529 "aliases": [ 00:13:54.529 "10882be9-e1a7-4a75-9036-f43dfc686b6f" 00:13:54.529 ], 00:13:54.529 "product_name": "Malloc disk", 00:13:54.529 "block_size": 512, 00:13:54.529 "num_blocks": 65536, 00:13:54.529 "uuid": "10882be9-e1a7-4a75-9036-f43dfc686b6f", 00:13:54.529 "assigned_rate_limits": { 00:13:54.529 "rw_ios_per_sec": 0, 00:13:54.529 "rw_mbytes_per_sec": 0, 00:13:54.529 "r_mbytes_per_sec": 0, 00:13:54.529 "w_mbytes_per_sec": 0 00:13:54.529 }, 00:13:54.529 
"claimed": true, 00:13:54.529 "claim_type": "exclusive_write", 00:13:54.529 "zoned": false, 00:13:54.529 "supported_io_types": { 00:13:54.529 "read": true, 00:13:54.529 "write": true, 00:13:54.529 "unmap": true, 00:13:54.529 "flush": true, 00:13:54.529 "reset": true, 00:13:54.529 "nvme_admin": false, 00:13:54.529 "nvme_io": false, 00:13:54.529 "nvme_io_md": false, 00:13:54.529 "write_zeroes": true, 00:13:54.529 "zcopy": true, 00:13:54.529 "get_zone_info": false, 00:13:54.529 "zone_management": false, 00:13:54.529 "zone_append": false, 00:13:54.529 "compare": false, 00:13:54.529 "compare_and_write": false, 00:13:54.529 "abort": true, 00:13:54.529 "seek_hole": false, 00:13:54.529 "seek_data": false, 00:13:54.529 "copy": true, 00:13:54.529 "nvme_iov_md": false 00:13:54.529 }, 00:13:54.529 "memory_domains": [ 00:13:54.529 { 00:13:54.529 "dma_device_id": "system", 00:13:54.529 "dma_device_type": 1 00:13:54.529 }, 00:13:54.529 { 00:13:54.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.529 "dma_device_type": 2 00:13:54.529 } 00:13:54.529 ], 00:13:54.529 "driver_specific": {} 00:13:54.529 }' 00:13:54.529 13:39:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.529 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.529 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:54.529 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.786 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.786 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:54.786 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.786 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.786 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:54.786 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.786 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:55.043 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:55.043 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:55.302 [2024-07-12 13:39:43.648725] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:55.302 [2024-07-12 13:39:43.648750] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:55.302 [2024-07-12 13:39:43.648805] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:55.302 [2024-07-12 13:39:43.648853] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:55.302 [2024-07-12 13:39:43.648865] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e1fe90 name Existed_Raid, state offline 00:13:55.302 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 453001 00:13:55.302 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 453001 ']' 00:13:55.302 13:39:43 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@952 -- # kill -0 453001 00:13:55.302 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:55.302 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:55.302 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 453001 00:13:55.302 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:55.302 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:55.302 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 453001' 00:13:55.302 killing process with pid 453001 00:13:55.302 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 453001 00:13:55.302 [2024-07-12 13:39:43.716269] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:55.302 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 453001 00:13:55.302 [2024-07-12 13:39:43.746669] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:55.561 13:39:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:55.561 00:13:55.561 real 0m28.620s 00:13:55.561 user 0m52.492s 00:13:55.561 sys 0m5.091s 00:13:55.561 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:55.561 13:39:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:55.561 ************************************ 00:13:55.561 END TEST raid_state_function_test_sb 00:13:55.561 ************************************ 00:13:55.561 13:39:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:55.561 13:39:44 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:13:55.561 13:39:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:55.561 13:39:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:55.561 13:39:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:55.561 ************************************ 00:13:55.561 START TEST raid_superblock_test 00:13:55.561 ************************************ 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@398 -- # local strip_size 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=457290 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 457290 /var/tmp/spdk-raid.sock 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 457290 ']' 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:55.561 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:55.561 13:39:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:55.561 [2024-07-12 13:39:44.119132] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:13:55.561 [2024-07-12 13:39:44.119200] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid457290 ] 00:13:55.820 [2024-07-12 13:39:44.247567] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:55.820 [2024-07-12 13:39:44.353792] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:56.078 [2024-07-12 13:39:44.425186] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:56.078 [2024-07-12 13:39:44.425225] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:56.647 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:56.647 13:39:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:56.647 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:56.647 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:56.647 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:56.647 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:56.647 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:56.647 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:56.647 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:56.647 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:56.647 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:56.906 malloc1 00:13:56.906 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:57.165 [2024-07-12 13:39:45.525078] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:57.165 [2024-07-12 13:39:45.525126] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:57.165 [2024-07-12 13:39:45.525147] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b60e90 00:13:57.165 [2024-07-12 13:39:45.525159] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:57.165 [2024-07-12 13:39:45.526880] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:57.165 [2024-07-12 13:39:45.526909] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:57.165 pt1 00:13:57.165 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:57.165 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:57.165 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:57.165 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:57.166 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:57.166 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:57.166 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:57.166 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:57.166 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:57.424 malloc2 00:13:57.424 13:39:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:57.683 [2024-07-12 13:39:46.020335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:57.683 [2024-07-12 13:39:46.020380] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:57.683 [2024-07-12 13:39:46.020398] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bfefb0 00:13:57.683 [2024-07-12 13:39:46.020417] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:57.683 [2024-07-12 13:39:46.022000] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:57.683 [2024-07-12 13:39:46.022028] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:57.683 pt2 00:13:57.683 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:57.683 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:57.683 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:13:57.683 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:13:57.683 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:57.683 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:57.683 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:57.683 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:57.683 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:57.942 malloc3 00:13:57.942 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:57.942 [2024-07-12 13:39:46.514257] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:57.942 [2024-07-12 13:39:46.514305] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:57.942 [2024-07-12 13:39:46.514323] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bffce0 00:13:57.942 [2024-07-12 13:39:46.514336] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:57.942 [2024-07-12 13:39:46.515962] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:57.942 [2024-07-12 13:39:46.515991] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:57.942 pt3 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:58.201 [2024-07-12 13:39:46.758921] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:58.201 [2024-07-12 13:39:46.760284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:58.201 [2024-07-12 13:39:46.760341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:58.201 [2024-07-12 13:39:46.760493] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c029a0 00:13:58.201 [2024-07-12 13:39:46.760504] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:58.201 [2024-07-12 13:39:46.760701] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b63ed0 00:13:58.201 [2024-07-12 13:39:46.760840] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c029a0 00:13:58.201 [2024-07-12 13:39:46.760850] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c029a0 00:13:58.201 [2024-07-12 13:39:46.760959] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.201 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.460 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.460 13:39:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:58.460 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.460 "name": "raid_bdev1", 00:13:58.460 "uuid": "0317f574-9f8c-476f-8ee0-ecec16aeb9c6", 00:13:58.460 "strip_size_kb": 64, 00:13:58.460 "state": "online", 00:13:58.460 "raid_level": "raid0", 00:13:58.460 "superblock": true, 00:13:58.460 "num_base_bdevs": 3, 
00:13:58.460 "num_base_bdevs_discovered": 3, 00:13:58.460 "num_base_bdevs_operational": 3, 00:13:58.460 "base_bdevs_list": [ 00:13:58.460 { 00:13:58.460 "name": "pt1", 00:13:58.460 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:58.460 "is_configured": true, 00:13:58.460 "data_offset": 2048, 00:13:58.460 "data_size": 63488 00:13:58.460 }, 00:13:58.460 { 00:13:58.460 "name": "pt2", 00:13:58.460 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:58.460 "is_configured": true, 00:13:58.460 "data_offset": 2048, 00:13:58.460 "data_size": 63488 00:13:58.460 }, 00:13:58.460 { 00:13:58.460 "name": "pt3", 00:13:58.460 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:58.460 "is_configured": true, 00:13:58.460 "data_offset": 2048, 00:13:58.460 "data_size": 63488 00:13:58.460 } 00:13:58.460 ] 00:13:58.460 }' 00:13:58.460 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.460 13:39:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.123 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:59.123 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:59.123 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:59.123 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:59.123 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:59.123 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:59.123 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:59.123 13:39:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:59.723 [2024-07-12 13:39:48.134825] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:59.723 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:59.723 "name": "raid_bdev1", 00:13:59.723 "aliases": [ 00:13:59.723 "0317f574-9f8c-476f-8ee0-ecec16aeb9c6" 00:13:59.723 ], 00:13:59.723 "product_name": "Raid Volume", 00:13:59.723 "block_size": 512, 00:13:59.723 "num_blocks": 190464, 00:13:59.723 "uuid": "0317f574-9f8c-476f-8ee0-ecec16aeb9c6", 00:13:59.723 "assigned_rate_limits": { 00:13:59.723 "rw_ios_per_sec": 0, 00:13:59.723 "rw_mbytes_per_sec": 0, 00:13:59.723 "r_mbytes_per_sec": 0, 00:13:59.723 "w_mbytes_per_sec": 0 00:13:59.723 }, 00:13:59.723 "claimed": false, 00:13:59.723 "zoned": false, 00:13:59.723 "supported_io_types": { 00:13:59.723 "read": true, 00:13:59.723 "write": true, 00:13:59.723 "unmap": true, 00:13:59.723 "flush": true, 00:13:59.723 "reset": true, 00:13:59.723 "nvme_admin": false, 00:13:59.723 "nvme_io": false, 00:13:59.723 "nvme_io_md": false, 00:13:59.723 "write_zeroes": true, 00:13:59.723 "zcopy": false, 00:13:59.723 "get_zone_info": false, 00:13:59.723 "zone_management": false, 00:13:59.723 "zone_append": false, 00:13:59.723 "compare": false, 00:13:59.723 "compare_and_write": false, 00:13:59.723 "abort": false, 00:13:59.723 "seek_hole": false, 00:13:59.723 "seek_data": false, 00:13:59.723 "copy": false, 00:13:59.723 "nvme_iov_md": false 00:13:59.723 }, 00:13:59.723 "memory_domains": [ 00:13:59.723 { 00:13:59.723 "dma_device_id": "system", 00:13:59.723 "dma_device_type": 1 
00:13:59.723 }, 00:13:59.723 { 00:13:59.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.723 "dma_device_type": 2 00:13:59.723 }, 00:13:59.724 { 00:13:59.724 "dma_device_id": "system", 00:13:59.724 "dma_device_type": 1 00:13:59.724 }, 00:13:59.724 { 00:13:59.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.724 "dma_device_type": 2 00:13:59.724 }, 00:13:59.724 { 00:13:59.724 "dma_device_id": "system", 00:13:59.724 "dma_device_type": 1 00:13:59.724 }, 00:13:59.724 { 00:13:59.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.724 "dma_device_type": 2 00:13:59.724 } 00:13:59.724 ], 00:13:59.724 "driver_specific": { 00:13:59.724 "raid": { 00:13:59.724 "uuid": "0317f574-9f8c-476f-8ee0-ecec16aeb9c6", 00:13:59.724 "strip_size_kb": 64, 00:13:59.724 "state": "online", 00:13:59.724 "raid_level": "raid0", 00:13:59.724 "superblock": true, 00:13:59.724 "num_base_bdevs": 3, 00:13:59.724 "num_base_bdevs_discovered": 3, 00:13:59.724 "num_base_bdevs_operational": 3, 00:13:59.724 "base_bdevs_list": [ 00:13:59.724 { 00:13:59.724 "name": "pt1", 00:13:59.724 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:59.724 "is_configured": true, 00:13:59.724 "data_offset": 2048, 00:13:59.724 "data_size": 63488 00:13:59.724 }, 00:13:59.724 { 00:13:59.724 "name": "pt2", 00:13:59.724 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:59.724 "is_configured": true, 00:13:59.724 "data_offset": 2048, 00:13:59.724 "data_size": 63488 00:13:59.724 }, 00:13:59.724 { 00:13:59.724 "name": "pt3", 00:13:59.724 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:59.724 "is_configured": true, 00:13:59.724 "data_offset": 2048, 00:13:59.724 "data_size": 63488 00:13:59.724 } 00:13:59.724 ] 00:13:59.724 } 00:13:59.724 } 00:13:59.724 }' 00:13:59.724 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:59.724 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:59.724 pt2 00:13:59.724 pt3' 00:13:59.724 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:59.724 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:59.724 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:00.339 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:00.339 "name": "pt1", 00:14:00.339 "aliases": [ 00:14:00.339 "00000000-0000-0000-0000-000000000001" 00:14:00.339 ], 00:14:00.339 "product_name": "passthru", 00:14:00.339 "block_size": 512, 00:14:00.339 "num_blocks": 65536, 00:14:00.339 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:00.339 "assigned_rate_limits": { 00:14:00.339 "rw_ios_per_sec": 0, 00:14:00.339 "rw_mbytes_per_sec": 0, 00:14:00.339 "r_mbytes_per_sec": 0, 00:14:00.339 "w_mbytes_per_sec": 0 00:14:00.339 }, 00:14:00.339 "claimed": true, 00:14:00.339 "claim_type": "exclusive_write", 00:14:00.339 "zoned": false, 00:14:00.339 "supported_io_types": { 00:14:00.339 "read": true, 00:14:00.339 "write": true, 00:14:00.339 "unmap": true, 00:14:00.339 "flush": true, 00:14:00.339 "reset": true, 00:14:00.339 "nvme_admin": false, 00:14:00.339 "nvme_io": false, 00:14:00.339 "nvme_io_md": false, 00:14:00.339 "write_zeroes": true, 00:14:00.339 "zcopy": true, 00:14:00.339 "get_zone_info": false, 00:14:00.339 "zone_management": false, 
00:14:00.339 "zone_append": false, 00:14:00.339 "compare": false, 00:14:00.339 "compare_and_write": false, 00:14:00.339 "abort": true, 00:14:00.339 "seek_hole": false, 00:14:00.339 "seek_data": false, 00:14:00.339 "copy": true, 00:14:00.340 "nvme_iov_md": false 00:14:00.340 }, 00:14:00.340 "memory_domains": [ 00:14:00.340 { 00:14:00.340 "dma_device_id": "system", 00:14:00.340 "dma_device_type": 1 00:14:00.340 }, 00:14:00.340 { 00:14:00.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.340 "dma_device_type": 2 00:14:00.340 } 00:14:00.340 ], 00:14:00.340 "driver_specific": { 00:14:00.340 "passthru": { 00:14:00.340 "name": "pt1", 00:14:00.340 "base_bdev_name": "malloc1" 00:14:00.340 } 00:14:00.340 } 00:14:00.340 }' 00:14:00.340 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.340 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.340 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:00.340 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:00.614 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:00.614 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:00.614 13:39:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:00.614 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:00.614 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:00.614 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.614 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.886 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:00.887 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:00.887 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:00.887 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:00.887 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:00.887 "name": "pt2", 00:14:00.887 "aliases": [ 00:14:00.887 "00000000-0000-0000-0000-000000000002" 00:14:00.887 ], 00:14:00.887 "product_name": "passthru", 00:14:00.887 "block_size": 512, 00:14:00.887 "num_blocks": 65536, 00:14:00.887 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:00.887 "assigned_rate_limits": { 00:14:00.887 "rw_ios_per_sec": 0, 00:14:00.887 "rw_mbytes_per_sec": 0, 00:14:00.887 "r_mbytes_per_sec": 0, 00:14:00.887 "w_mbytes_per_sec": 0 00:14:00.887 }, 00:14:00.887 "claimed": true, 00:14:00.887 "claim_type": "exclusive_write", 00:14:00.887 "zoned": false, 00:14:00.887 "supported_io_types": { 00:14:00.887 "read": true, 00:14:00.887 "write": true, 00:14:00.887 "unmap": true, 00:14:00.887 "flush": true, 00:14:00.887 "reset": true, 00:14:00.887 "nvme_admin": false, 00:14:00.887 "nvme_io": false, 00:14:00.887 "nvme_io_md": false, 00:14:00.887 "write_zeroes": true, 00:14:00.887 "zcopy": true, 00:14:00.887 "get_zone_info": false, 00:14:00.887 "zone_management": false, 00:14:00.887 "zone_append": false, 00:14:00.887 "compare": false, 00:14:00.887 "compare_and_write": false, 00:14:00.887 "abort": true, 
00:14:00.887 "seek_hole": false, 00:14:00.887 "seek_data": false, 00:14:00.887 "copy": true, 00:14:00.887 "nvme_iov_md": false 00:14:00.887 }, 00:14:00.887 "memory_domains": [ 00:14:00.887 { 00:14:00.887 "dma_device_id": "system", 00:14:00.887 "dma_device_type": 1 00:14:00.887 }, 00:14:00.887 { 00:14:00.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.887 "dma_device_type": 2 00:14:00.887 } 00:14:00.887 ], 00:14:00.887 "driver_specific": { 00:14:00.887 "passthru": { 00:14:00.887 "name": "pt2", 00:14:00.887 "base_bdev_name": "malloc2" 00:14:00.887 } 00:14:00.887 } 00:14:00.887 }' 00:14:00.887 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.887 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:01.157 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:01.157 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:01.157 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:01.157 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:01.157 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:01.157 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:01.157 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:01.157 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:01.157 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:01.435 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:01.435 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:01.435 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:01.435 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:01.435 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:01.435 "name": "pt3", 00:14:01.435 "aliases": [ 00:14:01.435 "00000000-0000-0000-0000-000000000003" 00:14:01.435 ], 00:14:01.435 "product_name": "passthru", 00:14:01.435 "block_size": 512, 00:14:01.435 "num_blocks": 65536, 00:14:01.435 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:01.435 "assigned_rate_limits": { 00:14:01.435 "rw_ios_per_sec": 0, 00:14:01.435 "rw_mbytes_per_sec": 0, 00:14:01.435 "r_mbytes_per_sec": 0, 00:14:01.435 "w_mbytes_per_sec": 0 00:14:01.435 }, 00:14:01.435 "claimed": true, 00:14:01.435 "claim_type": "exclusive_write", 00:14:01.435 "zoned": false, 00:14:01.435 "supported_io_types": { 00:14:01.435 "read": true, 00:14:01.435 "write": true, 00:14:01.435 "unmap": true, 00:14:01.435 "flush": true, 00:14:01.435 "reset": true, 00:14:01.435 "nvme_admin": false, 00:14:01.435 "nvme_io": false, 00:14:01.435 "nvme_io_md": false, 00:14:01.435 "write_zeroes": true, 00:14:01.435 "zcopy": true, 00:14:01.435 "get_zone_info": false, 00:14:01.435 "zone_management": false, 00:14:01.435 "zone_append": false, 00:14:01.435 "compare": false, 00:14:01.435 "compare_and_write": false, 00:14:01.435 "abort": true, 00:14:01.435 "seek_hole": false, 00:14:01.435 "seek_data": false, 00:14:01.435 "copy": true, 00:14:01.435 "nvme_iov_md": false 
00:14:01.435 }, 00:14:01.435 "memory_domains": [ 00:14:01.435 { 00:14:01.435 "dma_device_id": "system", 00:14:01.435 "dma_device_type": 1 00:14:01.435 }, 00:14:01.435 { 00:14:01.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.435 "dma_device_type": 2 00:14:01.435 } 00:14:01.435 ], 00:14:01.435 "driver_specific": { 00:14:01.435 "passthru": { 00:14:01.435 "name": "pt3", 00:14:01.435 "base_bdev_name": "malloc3" 00:14:01.435 } 00:14:01.435 } 00:14:01.435 }' 00:14:01.435 13:39:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:01.728 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:01.728 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:01.728 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:01.728 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:01.728 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:01.728 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:01.728 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:01.728 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:01.728 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:01.728 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:01.996 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:01.996 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:01.996 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:01.996 [2024-07-12 13:39:50.561336] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:02.261 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0317f574-9f8c-476f-8ee0-ecec16aeb9c6 00:14:02.261 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 0317f574-9f8c-476f-8ee0-ecec16aeb9c6 ']' 00:14:02.261 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:02.261 [2024-07-12 13:39:50.817722] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:02.261 [2024-07-12 13:39:50.817740] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:02.261 [2024-07-12 13:39:50.817787] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:02.261 [2024-07-12 13:39:50.817838] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:02.261 [2024-07-12 13:39:50.817850] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c029a0 name raid_bdev1, state offline 00:14:02.530 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.530 13:39:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:02.530 13:39:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:02.530 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:02.530 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:02.530 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:02.812 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:02.812 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:03.094 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:03.094 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:03.674 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:03.674 13:39:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:03.932 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 
malloc2 malloc3' -n raid_bdev1 00:14:03.932 [2024-07-12 13:39:52.506112] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:03.932 [2024-07-12 13:39:52.507452] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:03.932 [2024-07-12 13:39:52.507494] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:03.932 [2024-07-12 13:39:52.507540] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:03.932 [2024-07-12 13:39:52.507579] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:03.932 [2024-07-12 13:39:52.507603] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:03.932 [2024-07-12 13:39:52.507620] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:03.932 [2024-07-12 13:39:52.507630] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b64040 name raid_bdev1, state configuring 00:14:03.932 request: 00:14:03.932 { 00:14:03.932 "name": "raid_bdev1", 00:14:03.932 "raid_level": "raid0", 00:14:03.932 "base_bdevs": [ 00:14:03.932 "malloc1", 00:14:03.932 "malloc2", 00:14:03.932 "malloc3" 00:14:03.932 ], 00:14:03.932 "strip_size_kb": 64, 00:14:03.932 "superblock": false, 00:14:03.932 "method": "bdev_raid_create", 00:14:03.932 "req_id": 1 00:14:03.932 } 00:14:03.932 Got JSON-RPC error response 00:14:03.932 response: 00:14:03.932 { 00:14:03.932 "code": -17, 00:14:03.932 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:03.932 } 00:14:04.191 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:04.191 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:04.191 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:04.191 13:39:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:04.191 13:39:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.191 13:39:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:04.450 13:39:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:04.450 13:39:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:04.450 13:39:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:04.450 [2024-07-12 13:39:52.999346] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:04.450 [2024-07-12 13:39:52.999384] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:04.450 [2024-07-12 13:39:52.999402] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c00de0 00:14:04.450 [2024-07-12 13:39:52.999414] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:04.450 [2024-07-12 13:39:53.001018] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:04.450 [2024-07-12 13:39:53.001045] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:04.450 [2024-07-12 13:39:53.001105] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:04.450 [2024-07-12 13:39:53.001129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:04.450 pt1 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.450 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:04.708 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.708 "name": "raid_bdev1", 00:14:04.708 "uuid": "0317f574-9f8c-476f-8ee0-ecec16aeb9c6", 00:14:04.708 "strip_size_kb": 64, 00:14:04.708 "state": "configuring", 00:14:04.708 "raid_level": "raid0", 00:14:04.708 "superblock": true, 00:14:04.708 "num_base_bdevs": 3, 00:14:04.708 "num_base_bdevs_discovered": 1, 00:14:04.708 "num_base_bdevs_operational": 3, 00:14:04.708 "base_bdevs_list": [ 00:14:04.708 { 00:14:04.708 "name": "pt1", 00:14:04.708 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:04.708 "is_configured": true, 00:14:04.708 "data_offset": 2048, 00:14:04.708 "data_size": 63488 00:14:04.708 }, 00:14:04.708 { 00:14:04.708 "name": null, 00:14:04.708 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:04.708 "is_configured": false, 00:14:04.708 "data_offset": 2048, 00:14:04.708 "data_size": 63488 00:14:04.708 }, 00:14:04.708 { 00:14:04.708 "name": null, 00:14:04.708 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:04.708 "is_configured": false, 00:14:04.708 "data_offset": 2048, 00:14:04.708 "data_size": 63488 00:14:04.708 } 00:14:04.708 ] 00:14:04.708 }' 00:14:04.708 13:39:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.708 13:39:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:05.643 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:05.643 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:05.902 [2024-07-12 13:39:54.375023] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:05.902 [2024-07-12 13:39:54.375069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:05.902 [2024-07-12 13:39:54.375090] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b610c0 00:14:05.902 [2024-07-12 13:39:54.375103] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:05.902 [2024-07-12 13:39:54.375435] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:05.902 [2024-07-12 13:39:54.375453] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:05.902 [2024-07-12 13:39:54.375512] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:05.902 [2024-07-12 13:39:54.375530] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:05.902 pt2 00:14:05.902 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:06.470 [2024-07-12 13:39:54.880383] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.470 13:39:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:06.730 13:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:06.730 "name": "raid_bdev1", 00:14:06.730 "uuid": "0317f574-9f8c-476f-8ee0-ecec16aeb9c6", 00:14:06.730 "strip_size_kb": 64, 00:14:06.730 "state": "configuring", 00:14:06.730 "raid_level": "raid0", 00:14:06.730 "superblock": true, 00:14:06.730 "num_base_bdevs": 3, 00:14:06.730 "num_base_bdevs_discovered": 1, 00:14:06.730 "num_base_bdevs_operational": 3, 00:14:06.730 "base_bdevs_list": [ 00:14:06.730 { 00:14:06.730 "name": "pt1", 00:14:06.730 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:06.730 "is_configured": true, 00:14:06.730 "data_offset": 2048, 00:14:06.730 "data_size": 63488 00:14:06.730 }, 00:14:06.730 { 00:14:06.730 "name": null, 00:14:06.730 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:06.730 "is_configured": false, 00:14:06.730 
"data_offset": 2048, 00:14:06.730 "data_size": 63488 00:14:06.730 }, 00:14:06.730 { 00:14:06.730 "name": null, 00:14:06.730 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:06.730 "is_configured": false, 00:14:06.730 "data_offset": 2048, 00:14:06.730 "data_size": 63488 00:14:06.730 } 00:14:06.730 ] 00:14:06.730 }' 00:14:06.730 13:39:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:06.730 13:39:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:07.667 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:07.667 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:07.667 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:07.926 [2024-07-12 13:39:56.260024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:07.926 [2024-07-12 13:39:56.260080] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:07.926 [2024-07-12 13:39:56.260099] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b5fab0 00:14:07.926 [2024-07-12 13:39:56.260111] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:07.926 [2024-07-12 13:39:56.260442] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:07.926 [2024-07-12 13:39:56.260459] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:07.926 [2024-07-12 13:39:56.260521] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:07.926 [2024-07-12 13:39:56.260538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:07.926 pt2 00:14:07.926 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:07.926 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:07.926 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:07.926 [2024-07-12 13:39:56.508683] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:08.186 [2024-07-12 13:39:56.508717] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:08.186 [2024-07-12 13:39:56.508733] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c035a0 00:14:08.186 [2024-07-12 13:39:56.508745] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:08.186 [2024-07-12 13:39:56.509038] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:08.186 [2024-07-12 13:39:56.509055] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:08.186 [2024-07-12 13:39:56.509107] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:08.186 [2024-07-12 13:39:56.509124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:08.186 [2024-07-12 13:39:56.509226] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c038c0 00:14:08.186 [2024-07-12 13:39:56.509236] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:08.186 [2024-07-12 13:39:56.509397] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b63ed0 00:14:08.186 [2024-07-12 13:39:56.509515] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c038c0 00:14:08.186 [2024-07-12 13:39:56.509525] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c038c0 00:14:08.186 [2024-07-12 13:39:56.509618] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:08.186 pt3 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.186 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:08.446 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:08.446 "name": "raid_bdev1", 00:14:08.446 "uuid": "0317f574-9f8c-476f-8ee0-ecec16aeb9c6", 00:14:08.446 "strip_size_kb": 64, 00:14:08.446 "state": "online", 00:14:08.446 "raid_level": "raid0", 00:14:08.446 "superblock": true, 00:14:08.446 "num_base_bdevs": 3, 00:14:08.446 "num_base_bdevs_discovered": 3, 00:14:08.446 "num_base_bdevs_operational": 3, 00:14:08.446 "base_bdevs_list": [ 00:14:08.446 { 00:14:08.446 "name": "pt1", 00:14:08.446 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:08.446 "is_configured": true, 00:14:08.446 "data_offset": 2048, 00:14:08.446 "data_size": 63488 00:14:08.446 }, 00:14:08.446 { 00:14:08.446 "name": "pt2", 00:14:08.446 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:08.446 "is_configured": true, 00:14:08.446 "data_offset": 2048, 00:14:08.446 "data_size": 63488 00:14:08.446 }, 00:14:08.446 { 00:14:08.446 "name": "pt3", 00:14:08.447 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:08.447 "is_configured": true, 00:14:08.447 "data_offset": 2048, 00:14:08.447 "data_size": 63488 00:14:08.447 } 00:14:08.447 ] 00:14:08.447 }' 00:14:08.447 13:39:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:08.447 13:39:56 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:09.016 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:09.016 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:09.016 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:09.016 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:09.016 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:09.016 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:09.016 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:09.016 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:09.276 [2024-07-12 13:39:57.732223] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:09.276 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:09.276 "name": "raid_bdev1", 00:14:09.276 "aliases": [ 00:14:09.276 "0317f574-9f8c-476f-8ee0-ecec16aeb9c6" 00:14:09.276 ], 00:14:09.276 "product_name": "Raid Volume", 00:14:09.276 "block_size": 512, 00:14:09.276 "num_blocks": 190464, 00:14:09.276 "uuid": "0317f574-9f8c-476f-8ee0-ecec16aeb9c6", 00:14:09.276 "assigned_rate_limits": { 00:14:09.276 "rw_ios_per_sec": 0, 00:14:09.276 "rw_mbytes_per_sec": 0, 00:14:09.276 "r_mbytes_per_sec": 0, 00:14:09.276 "w_mbytes_per_sec": 0 00:14:09.276 }, 00:14:09.276 "claimed": false, 00:14:09.276 "zoned": false, 00:14:09.276 "supported_io_types": { 00:14:09.276 "read": true, 00:14:09.276 "write": true, 00:14:09.276 "unmap": true, 00:14:09.276 "flush": true, 00:14:09.276 "reset": true, 00:14:09.276 "nvme_admin": false, 00:14:09.276 "nvme_io": false, 00:14:09.276 "nvme_io_md": false, 00:14:09.276 "write_zeroes": true, 00:14:09.276 "zcopy": false, 00:14:09.276 "get_zone_info": false, 00:14:09.276 "zone_management": false, 00:14:09.276 "zone_append": false, 00:14:09.276 "compare": false, 00:14:09.276 "compare_and_write": false, 00:14:09.276 "abort": false, 00:14:09.276 "seek_hole": false, 00:14:09.276 "seek_data": false, 00:14:09.276 "copy": false, 00:14:09.276 "nvme_iov_md": false 00:14:09.276 }, 00:14:09.276 "memory_domains": [ 00:14:09.276 { 00:14:09.276 "dma_device_id": "system", 00:14:09.276 "dma_device_type": 1 00:14:09.276 }, 00:14:09.276 { 00:14:09.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.276 "dma_device_type": 2 00:14:09.276 }, 00:14:09.276 { 00:14:09.276 "dma_device_id": "system", 00:14:09.276 "dma_device_type": 1 00:14:09.276 }, 00:14:09.276 { 00:14:09.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.276 "dma_device_type": 2 00:14:09.276 }, 00:14:09.276 { 00:14:09.276 "dma_device_id": "system", 00:14:09.276 "dma_device_type": 1 00:14:09.276 }, 00:14:09.276 { 00:14:09.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.276 "dma_device_type": 2 00:14:09.276 } 00:14:09.276 ], 00:14:09.276 "driver_specific": { 00:14:09.276 "raid": { 00:14:09.276 "uuid": "0317f574-9f8c-476f-8ee0-ecec16aeb9c6", 00:14:09.276 "strip_size_kb": 64, 00:14:09.276 "state": "online", 00:14:09.276 "raid_level": "raid0", 00:14:09.276 "superblock": true, 00:14:09.276 "num_base_bdevs": 3, 00:14:09.276 "num_base_bdevs_discovered": 3, 
00:14:09.276 "num_base_bdevs_operational": 3, 00:14:09.276 "base_bdevs_list": [ 00:14:09.276 { 00:14:09.276 "name": "pt1", 00:14:09.276 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:09.276 "is_configured": true, 00:14:09.276 "data_offset": 2048, 00:14:09.276 "data_size": 63488 00:14:09.277 }, 00:14:09.277 { 00:14:09.277 "name": "pt2", 00:14:09.277 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:09.277 "is_configured": true, 00:14:09.277 "data_offset": 2048, 00:14:09.277 "data_size": 63488 00:14:09.277 }, 00:14:09.277 { 00:14:09.277 "name": "pt3", 00:14:09.277 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:09.277 "is_configured": true, 00:14:09.277 "data_offset": 2048, 00:14:09.277 "data_size": 63488 00:14:09.277 } 00:14:09.277 ] 00:14:09.277 } 00:14:09.277 } 00:14:09.277 }' 00:14:09.277 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:09.277 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:09.277 pt2 00:14:09.277 pt3' 00:14:09.277 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:09.277 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:09.277 13:39:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:09.537 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:09.537 "name": "pt1", 00:14:09.537 "aliases": [ 00:14:09.537 "00000000-0000-0000-0000-000000000001" 00:14:09.537 ], 00:14:09.537 "product_name": "passthru", 00:14:09.537 "block_size": 512, 00:14:09.537 "num_blocks": 65536, 00:14:09.537 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:09.537 "assigned_rate_limits": { 00:14:09.537 "rw_ios_per_sec": 0, 00:14:09.537 "rw_mbytes_per_sec": 0, 00:14:09.537 "r_mbytes_per_sec": 0, 00:14:09.537 "w_mbytes_per_sec": 0 00:14:09.537 }, 00:14:09.537 "claimed": true, 00:14:09.537 "claim_type": "exclusive_write", 00:14:09.537 "zoned": false, 00:14:09.537 "supported_io_types": { 00:14:09.537 "read": true, 00:14:09.537 "write": true, 00:14:09.537 "unmap": true, 00:14:09.537 "flush": true, 00:14:09.537 "reset": true, 00:14:09.537 "nvme_admin": false, 00:14:09.537 "nvme_io": false, 00:14:09.537 "nvme_io_md": false, 00:14:09.537 "write_zeroes": true, 00:14:09.537 "zcopy": true, 00:14:09.537 "get_zone_info": false, 00:14:09.537 "zone_management": false, 00:14:09.537 "zone_append": false, 00:14:09.537 "compare": false, 00:14:09.537 "compare_and_write": false, 00:14:09.537 "abort": true, 00:14:09.537 "seek_hole": false, 00:14:09.537 "seek_data": false, 00:14:09.537 "copy": true, 00:14:09.537 "nvme_iov_md": false 00:14:09.537 }, 00:14:09.537 "memory_domains": [ 00:14:09.537 { 00:14:09.537 "dma_device_id": "system", 00:14:09.537 "dma_device_type": 1 00:14:09.537 }, 00:14:09.537 { 00:14:09.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.537 "dma_device_type": 2 00:14:09.537 } 00:14:09.537 ], 00:14:09.537 "driver_specific": { 00:14:09.537 "passthru": { 00:14:09.537 "name": "pt1", 00:14:09.537 "base_bdev_name": "malloc1" 00:14:09.537 } 00:14:09.537 } 00:14:09.537 }' 00:14:09.537 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.537 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:14:09.796 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:09.796 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.796 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.796 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:09.796 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.796 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.796 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:09.796 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.796 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:10.055 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:10.055 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:10.055 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:10.055 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:10.314 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:10.314 "name": "pt2", 00:14:10.314 "aliases": [ 00:14:10.314 "00000000-0000-0000-0000-000000000002" 00:14:10.314 ], 00:14:10.314 "product_name": "passthru", 00:14:10.314 "block_size": 512, 00:14:10.314 "num_blocks": 65536, 00:14:10.314 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:10.314 "assigned_rate_limits": { 00:14:10.314 "rw_ios_per_sec": 0, 00:14:10.314 "rw_mbytes_per_sec": 0, 00:14:10.314 "r_mbytes_per_sec": 0, 00:14:10.314 "w_mbytes_per_sec": 0 00:14:10.314 }, 00:14:10.314 "claimed": true, 00:14:10.314 "claim_type": "exclusive_write", 00:14:10.314 "zoned": false, 00:14:10.314 "supported_io_types": { 00:14:10.314 "read": true, 00:14:10.314 "write": true, 00:14:10.314 "unmap": true, 00:14:10.314 "flush": true, 00:14:10.314 "reset": true, 00:14:10.314 "nvme_admin": false, 00:14:10.314 "nvme_io": false, 00:14:10.314 "nvme_io_md": false, 00:14:10.314 "write_zeroes": true, 00:14:10.314 "zcopy": true, 00:14:10.314 "get_zone_info": false, 00:14:10.314 "zone_management": false, 00:14:10.314 "zone_append": false, 00:14:10.314 "compare": false, 00:14:10.314 "compare_and_write": false, 00:14:10.314 "abort": true, 00:14:10.314 "seek_hole": false, 00:14:10.314 "seek_data": false, 00:14:10.314 "copy": true, 00:14:10.314 "nvme_iov_md": false 00:14:10.314 }, 00:14:10.314 "memory_domains": [ 00:14:10.314 { 00:14:10.314 "dma_device_id": "system", 00:14:10.314 "dma_device_type": 1 00:14:10.314 }, 00:14:10.314 { 00:14:10.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.314 "dma_device_type": 2 00:14:10.314 } 00:14:10.314 ], 00:14:10.314 "driver_specific": { 00:14:10.314 "passthru": { 00:14:10.314 "name": "pt2", 00:14:10.314 "base_bdev_name": "malloc2" 00:14:10.314 } 00:14:10.314 } 00:14:10.314 }' 00:14:10.314 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:10.314 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:10.314 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:10.314 13:39:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:10.314 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:10.314 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:10.314 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:10.574 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:10.574 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:10.574 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:10.574 13:39:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:10.574 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:10.574 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:10.574 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:10.574 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:10.833 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:10.833 "name": "pt3", 00:14:10.833 "aliases": [ 00:14:10.833 "00000000-0000-0000-0000-000000000003" 00:14:10.833 ], 00:14:10.833 "product_name": "passthru", 00:14:10.833 "block_size": 512, 00:14:10.833 "num_blocks": 65536, 00:14:10.833 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:10.833 "assigned_rate_limits": { 00:14:10.833 "rw_ios_per_sec": 0, 00:14:10.833 "rw_mbytes_per_sec": 0, 00:14:10.833 "r_mbytes_per_sec": 0, 00:14:10.833 "w_mbytes_per_sec": 0 00:14:10.833 }, 00:14:10.833 "claimed": true, 00:14:10.833 "claim_type": "exclusive_write", 00:14:10.833 "zoned": false, 00:14:10.833 "supported_io_types": { 00:14:10.833 "read": true, 00:14:10.833 "write": true, 00:14:10.833 "unmap": true, 00:14:10.833 "flush": true, 00:14:10.833 "reset": true, 00:14:10.833 "nvme_admin": false, 00:14:10.833 "nvme_io": false, 00:14:10.833 "nvme_io_md": false, 00:14:10.833 "write_zeroes": true, 00:14:10.833 "zcopy": true, 00:14:10.833 "get_zone_info": false, 00:14:10.833 "zone_management": false, 00:14:10.833 "zone_append": false, 00:14:10.833 "compare": false, 00:14:10.833 "compare_and_write": false, 00:14:10.833 "abort": true, 00:14:10.833 "seek_hole": false, 00:14:10.833 "seek_data": false, 00:14:10.833 "copy": true, 00:14:10.833 "nvme_iov_md": false 00:14:10.833 }, 00:14:10.833 "memory_domains": [ 00:14:10.833 { 00:14:10.833 "dma_device_id": "system", 00:14:10.833 "dma_device_type": 1 00:14:10.833 }, 00:14:10.833 { 00:14:10.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.833 "dma_device_type": 2 00:14:10.833 } 00:14:10.833 ], 00:14:10.833 "driver_specific": { 00:14:10.833 "passthru": { 00:14:10.833 "name": "pt3", 00:14:10.833 "base_bdev_name": "malloc3" 00:14:10.833 } 00:14:10.833 } 00:14:10.833 }' 00:14:10.833 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:10.833 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:10.833 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:10.833 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:10.833 13:39:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:11.092 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:11.092 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:11.092 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:11.092 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:11.092 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:11.092 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:11.092 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:11.092 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:11.092 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:11.351 [2024-07-12 13:39:59.837776] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 0317f574-9f8c-476f-8ee0-ecec16aeb9c6 '!=' 0317f574-9f8c-476f-8ee0-ecec16aeb9c6 ']' 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 457290 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 457290 ']' 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 457290 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 457290 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 457290' 00:14:11.352 killing process with pid 457290 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 457290 00:14:11.352 [2024-07-12 13:39:59.907933] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:11.352 [2024-07-12 13:39:59.907988] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:11.352 [2024-07-12 13:39:59.908043] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:11.352 [2024-07-12 13:39:59.908055] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c038c0 name raid_bdev1, state offline 00:14:11.352 13:39:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 457290 00:14:11.611 [2024-07-12 13:39:59.938789] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:11.611 13:40:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:11.611 00:14:11.611 real 0m16.102s 00:14:11.611 user 0m29.074s 00:14:11.611 sys 0m2.838s 00:14:11.611 13:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:11.611 13:40:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:11.611 ************************************ 00:14:11.611 END TEST raid_superblock_test 00:14:11.611 ************************************ 00:14:11.870 13:40:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:11.870 13:40:00 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:14:11.870 13:40:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:11.870 13:40:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:11.870 13:40:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:11.870 ************************************ 00:14:11.870 START TEST raid_read_error_test 00:14:11.870 ************************************ 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:11.870 13:40:00 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.zEHo52ftHD 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=459705 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 459705 /var/tmp/spdk-raid.sock 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 459705 ']' 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:11.870 13:40:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:11.870 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:11.871 13:40:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:11.871 13:40:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:11.871 [2024-07-12 13:40:00.356733] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
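The raid_read_error_test trace around this point is dense, so it helps to see the sequence it exercises in one place. The sketch below only restates commands that appear verbatim in this log (the RPC socket, the bdevperf flags, and the malloc/error/passthru/raid RPCs); the shell variables, the loop, and the backgrounding are illustrative and are not the literal contents of bdev_raid.sh. $SPDK_DIR stands in for the full Jenkins workspace path seen in the trace.

    # Condensed sketch of the error-injection flow driven over /var/tmp/spdk-raid.sock
    sock=/var/tmp/spdk-raid.sock
    rpc() { "$SPDK_DIR/scripts/rpc.py" -s "$sock" "$@"; }   # rpc.py wrapper, path assumed via $SPDK_DIR
    bdevperf_log=$(mktemp -p /raidtest)

    # bdevperf is launched with the flags captured above and waits to be started over RPC (-z)
    "$SPDK_DIR/build/examples/bdevperf" -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 128k -q 1 -z -f -L bdev_raid &> "$bdevperf_log" &

    # Each base bdev is a malloc disk wrapped in an error bdev and a passthru bdev,
    # so failures can later be injected on EE_BaseBdevN_malloc without touching the raid
    for i in 1 2 3; do
        rpc bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
        rpc bdev_error_create "BaseBdev${i}_malloc"
        rpc bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"
    done

    # raid0, 64k strip, with superblock; then run the workload and fail reads on one leg
    rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
    "$SPDK_DIR/examples/bdev/bdevperf/bdevperf.py" -s "$sock" perform_tests &
    rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure

    # raid0 has no redundancy, so the injected failures must show up in the bdevperf log
    fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
    [[ $fail_per_s != "0.00" ]]

The write-error variant that follows later in this log is the same flow with "write failure" injected instead of "read failure".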
00:14:11.871 [2024-07-12 13:40:00.356868] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid459705 ] 00:14:12.129 [2024-07-12 13:40:00.549200] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:12.129 [2024-07-12 13:40:00.649736] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:12.388 [2024-07-12 13:40:00.715676] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:12.388 [2024-07-12 13:40:00.715708] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:12.955 13:40:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:12.955 13:40:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:12.955 13:40:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:12.955 13:40:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:12.955 BaseBdev1_malloc 00:14:12.955 13:40:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:13.213 true 00:14:13.213 13:40:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:13.472 [2024-07-12 13:40:01.959114] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:13.472 [2024-07-12 13:40:01.959163] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:13.472 [2024-07-12 13:40:01.959184] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1acaa10 00:14:13.472 [2024-07-12 13:40:01.959197] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:13.472 [2024-07-12 13:40:01.961105] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:13.472 [2024-07-12 13:40:01.961135] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:13.472 BaseBdev1 00:14:13.472 13:40:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:13.472 13:40:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:13.731 BaseBdev2_malloc 00:14:13.731 13:40:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:13.990 true 00:14:13.990 13:40:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:14.249 [2024-07-12 13:40:02.698685] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:14.249 [2024-07-12 13:40:02.698733] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:14.249 [2024-07-12 13:40:02.698755] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1acf250 00:14:14.249 [2024-07-12 13:40:02.698768] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:14.249 [2024-07-12 13:40:02.700402] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:14.249 [2024-07-12 13:40:02.700431] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:14.249 BaseBdev2 00:14:14.249 13:40:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:14.249 13:40:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:14.508 BaseBdev3_malloc 00:14:14.508 13:40:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:14.766 true 00:14:14.766 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:15.026 [2024-07-12 13:40:03.429725] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:15.026 [2024-07-12 13:40:03.429776] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:15.026 [2024-07-12 13:40:03.429797] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad1510 00:14:15.026 [2024-07-12 13:40:03.429810] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:15.026 [2024-07-12 13:40:03.431414] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:15.026 [2024-07-12 13:40:03.431443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:15.026 BaseBdev3 00:14:15.026 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:15.285 [2024-07-12 13:40:03.674409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:15.285 [2024-07-12 13:40:03.675795] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:15.285 [2024-07-12 13:40:03.675864] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:15.285 [2024-07-12 13:40:03.676077] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ad2bc0 00:14:15.285 [2024-07-12 13:40:03.676089] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:15.285 [2024-07-12 13:40:03.676293] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ad2760 00:14:15.285 [2024-07-12 13:40:03.676444] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ad2bc0 00:14:15.285 [2024-07-12 13:40:03.676455] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ad2bc0 00:14:15.285 [2024-07-12 13:40:03.676561] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:15.285 
13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:15.285 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:15.285 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:15.285 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:15.285 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:15.285 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:15.285 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.285 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.285 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.285 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.285 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.285 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:15.544 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.544 "name": "raid_bdev1", 00:14:15.544 "uuid": "242c9a68-0030-4725-8406-4789aff2d916", 00:14:15.544 "strip_size_kb": 64, 00:14:15.544 "state": "online", 00:14:15.544 "raid_level": "raid0", 00:14:15.544 "superblock": true, 00:14:15.544 "num_base_bdevs": 3, 00:14:15.544 "num_base_bdevs_discovered": 3, 00:14:15.544 "num_base_bdevs_operational": 3, 00:14:15.544 "base_bdevs_list": [ 00:14:15.544 { 00:14:15.544 "name": "BaseBdev1", 00:14:15.544 "uuid": "ee15bae4-2696-5d1e-a4df-ca8c74cde074", 00:14:15.544 "is_configured": true, 00:14:15.544 "data_offset": 2048, 00:14:15.544 "data_size": 63488 00:14:15.544 }, 00:14:15.544 { 00:14:15.544 "name": "BaseBdev2", 00:14:15.544 "uuid": "05b6ee7d-3fe8-55f1-9fe3-044d019db730", 00:14:15.544 "is_configured": true, 00:14:15.544 "data_offset": 2048, 00:14:15.544 "data_size": 63488 00:14:15.544 }, 00:14:15.544 { 00:14:15.544 "name": "BaseBdev3", 00:14:15.544 "uuid": "7e8e4739-26eb-5301-b32f-2384856b0606", 00:14:15.544 "is_configured": true, 00:14:15.544 "data_offset": 2048, 00:14:15.544 "data_size": 63488 00:14:15.544 } 00:14:15.544 ] 00:14:15.544 }' 00:14:15.544 13:40:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.544 13:40:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:16.113 13:40:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:16.113 13:40:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:16.113 [2024-07-12 13:40:04.661289] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1920ef0 00:14:17.052 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.312 13:40:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:17.570 13:40:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.570 "name": "raid_bdev1", 00:14:17.571 "uuid": "242c9a68-0030-4725-8406-4789aff2d916", 00:14:17.571 "strip_size_kb": 64, 00:14:17.571 "state": "online", 00:14:17.571 "raid_level": "raid0", 00:14:17.571 "superblock": true, 00:14:17.571 "num_base_bdevs": 3, 00:14:17.571 "num_base_bdevs_discovered": 3, 00:14:17.571 "num_base_bdevs_operational": 3, 00:14:17.571 "base_bdevs_list": [ 00:14:17.571 { 00:14:17.571 "name": "BaseBdev1", 00:14:17.571 "uuid": "ee15bae4-2696-5d1e-a4df-ca8c74cde074", 00:14:17.571 "is_configured": true, 00:14:17.571 "data_offset": 2048, 00:14:17.571 "data_size": 63488 00:14:17.571 }, 00:14:17.571 { 00:14:17.571 "name": "BaseBdev2", 00:14:17.571 "uuid": "05b6ee7d-3fe8-55f1-9fe3-044d019db730", 00:14:17.571 "is_configured": true, 00:14:17.571 "data_offset": 2048, 00:14:17.571 "data_size": 63488 00:14:17.571 }, 00:14:17.571 { 00:14:17.571 "name": "BaseBdev3", 00:14:17.571 "uuid": "7e8e4739-26eb-5301-b32f-2384856b0606", 00:14:17.571 "is_configured": true, 00:14:17.571 "data_offset": 2048, 00:14:17.571 "data_size": 63488 00:14:17.571 } 00:14:17.571 ] 00:14:17.571 }' 00:14:17.571 13:40:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.571 13:40:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.140 13:40:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:18.428 [2024-07-12 13:40:06.906866] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:18.428 [2024-07-12 13:40:06.906902] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:18.428 [2024-07-12 
13:40:06.910091] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:18.428 [2024-07-12 13:40:06.910130] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:18.428 [2024-07-12 13:40:06.910166] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:18.428 [2024-07-12 13:40:06.910184] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad2bc0 name raid_bdev1, state offline 00:14:18.428 0 00:14:18.428 13:40:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 459705 00:14:18.428 13:40:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 459705 ']' 00:14:18.428 13:40:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 459705 00:14:18.428 13:40:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:18.428 13:40:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:18.428 13:40:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 459705 00:14:18.428 13:40:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:18.428 13:40:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:18.428 13:40:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 459705' 00:14:18.428 killing process with pid 459705 00:14:18.428 13:40:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 459705 00:14:18.428 [2024-07-12 13:40:06.977501] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:18.428 13:40:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 459705 00:14:18.688 [2024-07-12 13:40:07.001210] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:18.688 13:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.zEHo52ftHD 00:14:18.688 13:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:18.688 13:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:18.688 13:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:14:18.688 13:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:18.688 13:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:18.688 13:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:18.688 13:40:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:14:18.688 00:14:18.688 real 0m7.011s 00:14:18.688 user 0m11.047s 00:14:18.688 sys 0m1.290s 00:14:18.688 13:40:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:18.688 13:40:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.688 ************************************ 00:14:18.688 END TEST raid_read_error_test 00:14:18.688 ************************************ 00:14:18.948 13:40:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:18.948 13:40:07 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:14:18.948 13:40:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:18.948 
13:40:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:18.948 13:40:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:18.948 ************************************ 00:14:18.948 START TEST raid_write_error_test 00:14:18.948 ************************************ 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Op48XazYHG 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=460685 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 460685 /var/tmp/spdk-raid.sock 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 460685 ']' 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:18.948 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:18.948 13:40:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.948 [2024-07-12 13:40:07.414235] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:14:18.948 [2024-07-12 13:40:07.414306] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid460685 ] 00:14:19.207 [2024-07-12 13:40:07.545952] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:19.207 [2024-07-12 13:40:07.649243] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:19.207 [2024-07-12 13:40:07.704677] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:19.207 [2024-07-12 13:40:07.704705] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:19.775 13:40:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:19.775 13:40:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:19.775 13:40:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:19.775 13:40:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:20.033 BaseBdev1_malloc 00:14:20.033 13:40:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:20.292 true 00:14:20.292 13:40:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:20.550 [2024-07-12 13:40:09.076032] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:20.550 [2024-07-12 13:40:09.076078] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:20.550 [2024-07-12 13:40:09.076098] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x141fa10 00:14:20.550 [2024-07-12 13:40:09.076110] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:20.550 [2024-07-12 13:40:09.077819] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:20.550 [2024-07-12 13:40:09.077848] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:14:20.550 BaseBdev1 00:14:20.550 13:40:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:20.551 13:40:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:20.809 BaseBdev2_malloc 00:14:20.809 13:40:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:21.067 true 00:14:21.067 13:40:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:21.326 [2024-07-12 13:40:09.822552] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:21.326 [2024-07-12 13:40:09.822595] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:21.326 [2024-07-12 13:40:09.822617] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1424250 00:14:21.326 [2024-07-12 13:40:09.822629] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:21.326 [2024-07-12 13:40:09.824093] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:21.326 [2024-07-12 13:40:09.824123] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:21.326 BaseBdev2 00:14:21.326 13:40:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:21.326 13:40:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:21.585 BaseBdev3_malloc 00:14:21.585 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:21.845 true 00:14:21.845 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:22.105 [2024-07-12 13:40:10.577317] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:22.105 [2024-07-12 13:40:10.577363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:22.105 [2024-07-12 13:40:10.577384] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1426510 00:14:22.105 [2024-07-12 13:40:10.577396] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:22.105 [2024-07-12 13:40:10.578769] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:22.105 [2024-07-12 13:40:10.578796] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:22.105 BaseBdev3 00:14:22.105 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:22.364 [2024-07-12 13:40:10.822001] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:22.364 [2024-07-12 13:40:10.823235] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:22.364 [2024-07-12 13:40:10.823302] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:22.364 [2024-07-12 13:40:10.823502] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1427bc0 00:14:22.365 [2024-07-12 13:40:10.823514] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:22.365 [2024-07-12 13:40:10.823700] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1427760 00:14:22.365 [2024-07-12 13:40:10.823842] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1427bc0 00:14:22.365 [2024-07-12 13:40:10.823852] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1427bc0 00:14:22.365 [2024-07-12 13:40:10.823960] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.365 13:40:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:22.624 13:40:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.624 "name": "raid_bdev1", 00:14:22.624 "uuid": "a00d68c0-f5e9-4023-ac76-143b6857a224", 00:14:22.624 "strip_size_kb": 64, 00:14:22.624 "state": "online", 00:14:22.624 "raid_level": "raid0", 00:14:22.624 "superblock": true, 00:14:22.624 "num_base_bdevs": 3, 00:14:22.624 "num_base_bdevs_discovered": 3, 00:14:22.624 "num_base_bdevs_operational": 3, 00:14:22.624 "base_bdevs_list": [ 00:14:22.624 { 00:14:22.624 "name": "BaseBdev1", 00:14:22.624 "uuid": "8b742702-ad26-5568-8443-9fe6a6f7caa3", 00:14:22.624 "is_configured": true, 00:14:22.624 "data_offset": 2048, 00:14:22.624 "data_size": 63488 00:14:22.624 }, 00:14:22.624 { 00:14:22.624 "name": "BaseBdev2", 00:14:22.624 "uuid": "86e756bd-ed8f-5b89-8ee1-ddf84f4a48f8", 00:14:22.624 "is_configured": true, 00:14:22.624 "data_offset": 2048, 00:14:22.624 "data_size": 63488 00:14:22.624 }, 00:14:22.624 { 00:14:22.624 "name": "BaseBdev3", 00:14:22.624 "uuid": 
"97521e19-407a-593e-bd52-8a89cac8bfce", 00:14:22.624 "is_configured": true, 00:14:22.624 "data_offset": 2048, 00:14:22.624 "data_size": 63488 00:14:22.624 } 00:14:22.624 ] 00:14:22.624 }' 00:14:22.624 13:40:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.624 13:40:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.191 13:40:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:23.191 13:40:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:23.451 [2024-07-12 13:40:11.784824] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1275ef0 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.387 13:40:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:24.648 13:40:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.648 "name": "raid_bdev1", 00:14:24.648 "uuid": "a00d68c0-f5e9-4023-ac76-143b6857a224", 00:14:24.648 "strip_size_kb": 64, 00:14:24.648 "state": "online", 00:14:24.648 "raid_level": "raid0", 00:14:24.648 "superblock": true, 00:14:24.648 "num_base_bdevs": 3, 00:14:24.648 "num_base_bdevs_discovered": 3, 00:14:24.648 "num_base_bdevs_operational": 3, 00:14:24.648 "base_bdevs_list": [ 00:14:24.648 { 00:14:24.648 "name": "BaseBdev1", 00:14:24.648 "uuid": "8b742702-ad26-5568-8443-9fe6a6f7caa3", 00:14:24.648 "is_configured": true, 00:14:24.648 "data_offset": 2048, 00:14:24.648 "data_size": 63488 00:14:24.648 }, 00:14:24.648 { 
00:14:24.648 "name": "BaseBdev2", 00:14:24.648 "uuid": "86e756bd-ed8f-5b89-8ee1-ddf84f4a48f8", 00:14:24.648 "is_configured": true, 00:14:24.648 "data_offset": 2048, 00:14:24.648 "data_size": 63488 00:14:24.648 }, 00:14:24.648 { 00:14:24.648 "name": "BaseBdev3", 00:14:24.648 "uuid": "97521e19-407a-593e-bd52-8a89cac8bfce", 00:14:24.648 "is_configured": true, 00:14:24.648 "data_offset": 2048, 00:14:24.648 "data_size": 63488 00:14:24.648 } 00:14:24.648 ] 00:14:24.648 }' 00:14:24.648 13:40:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.648 13:40:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.581 13:40:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:25.581 [2024-07-12 13:40:14.038901] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:25.581 [2024-07-12 13:40:14.038951] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:25.581 [2024-07-12 13:40:14.042123] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:25.581 [2024-07-12 13:40:14.042164] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:25.581 [2024-07-12 13:40:14.042199] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:25.581 [2024-07-12 13:40:14.042211] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1427bc0 name raid_bdev1, state offline 00:14:25.581 0 00:14:25.581 13:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 460685 00:14:25.581 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 460685 ']' 00:14:25.581 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 460685 00:14:25.581 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:25.581 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:25.581 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 460685 00:14:25.581 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:25.581 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:25.581 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 460685' 00:14:25.581 killing process with pid 460685 00:14:25.581 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 460685 00:14:25.581 [2024-07-12 13:40:14.104372] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:25.581 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 460685 00:14:25.581 [2024-07-12 13:40:14.128466] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:25.839 13:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Op48XazYHG 00:14:25.839 13:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:25.839 13:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:25.839 13:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # 
fail_per_s=0.45 00:14:25.839 13:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:25.839 13:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:25.839 13:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:25.839 13:40:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:14:25.839 00:14:25.839 real 0m7.040s 00:14:25.839 user 0m11.141s 00:14:25.839 sys 0m1.236s 00:14:25.839 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:25.839 13:40:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.839 ************************************ 00:14:25.839 END TEST raid_write_error_test 00:14:25.839 ************************************ 00:14:25.839 13:40:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:25.839 13:40:14 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:25.839 13:40:14 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:14:25.839 13:40:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:25.839 13:40:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:25.839 13:40:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:26.097 ************************************ 00:14:26.097 START TEST raid_state_function_test 00:14:26.097 ************************************ 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:26.097 13:40:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:26.097 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=461671 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 461671' 00:14:26.098 Process raid pid: 461671 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 461671 /var/tmp/spdk-raid.sock 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 461671 ']' 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:26.098 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:26.098 13:40:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.098 [2024-07-12 13:40:14.531329] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
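raid_state_function_test, which starts here, follows a different pattern from the bdevperf-based error tests: the concat raid is created before any of its base bdevs exist, so it has to sit in the "configuring" state, and the test then registers base bdevs one at a time and re-checks the state JSON. The sketch below condenses the checks visible in the trace that follows; the RPC names and the jq filter are taken verbatim from the log, while the helper structure is illustrative (the test actually deletes and re-creates Existed_Raid between steps).

    rpc() { "$SPDK_DIR/scripts/rpc.py" -s /var/tmp/spdk-raid.sock "$@"; }   # path assumed via $SPDK_DIR

    # Creating the raid before its members exist must leave it in the "configuring" state
    rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
    info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    [[ $(jq -r .state <<<"$info") == configuring ]]
    [[ $(jq -r .num_base_bdevs_discovered <<<"$info") == 0 ]]

    # Registering a real BaseBdev1 lets the raid claim it and count it as discovered
    rpc bdev_malloc_create 32 512 -b BaseBdev1
    info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    [[ $(jq -r .num_base_bdevs_discovered <<<"$info") == 1 ]]
    [[ $(jq -r .state <<<"$info") == configuring ]]   # still short of 3 operational base bdevs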
00:14:26.098 [2024-07-12 13:40:14.531396] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:26.098 [2024-07-12 13:40:14.662792] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:26.356 [2024-07-12 13:40:14.771087] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:26.356 [2024-07-12 13:40:14.841208] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:26.356 [2024-07-12 13:40:14.841243] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:26.922 13:40:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:26.922 13:40:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:26.922 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:27.181 [2024-07-12 13:40:15.681584] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:27.181 [2024-07-12 13:40:15.681630] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:27.181 [2024-07-12 13:40:15.681641] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:27.181 [2024-07-12 13:40:15.681653] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:27.181 [2024-07-12 13:40:15.681662] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:27.181 [2024-07-12 13:40:15.681673] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.181 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:27.439 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:14:27.439 "name": "Existed_Raid", 00:14:27.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.439 "strip_size_kb": 64, 00:14:27.439 "state": "configuring", 00:14:27.439 "raid_level": "concat", 00:14:27.439 "superblock": false, 00:14:27.439 "num_base_bdevs": 3, 00:14:27.439 "num_base_bdevs_discovered": 0, 00:14:27.439 "num_base_bdevs_operational": 3, 00:14:27.439 "base_bdevs_list": [ 00:14:27.439 { 00:14:27.439 "name": "BaseBdev1", 00:14:27.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.439 "is_configured": false, 00:14:27.439 "data_offset": 0, 00:14:27.439 "data_size": 0 00:14:27.439 }, 00:14:27.439 { 00:14:27.439 "name": "BaseBdev2", 00:14:27.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.439 "is_configured": false, 00:14:27.439 "data_offset": 0, 00:14:27.439 "data_size": 0 00:14:27.439 }, 00:14:27.439 { 00:14:27.439 "name": "BaseBdev3", 00:14:27.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.439 "is_configured": false, 00:14:27.439 "data_offset": 0, 00:14:27.439 "data_size": 0 00:14:27.439 } 00:14:27.439 ] 00:14:27.439 }' 00:14:27.439 13:40:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:27.439 13:40:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:28.008 13:40:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:28.267 [2024-07-12 13:40:16.712168] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:28.267 [2024-07-12 13:40:16.712201] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19ce350 name Existed_Raid, state configuring 00:14:28.267 13:40:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:28.526 [2024-07-12 13:40:16.892661] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:28.526 [2024-07-12 13:40:16.892694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:28.526 [2024-07-12 13:40:16.892704] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:28.526 [2024-07-12 13:40:16.892716] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:28.526 [2024-07-12 13:40:16.892725] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:28.526 [2024-07-12 13:40:16.892736] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:28.526 13:40:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:28.526 [2024-07-12 13:40:17.079130] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:28.526 BaseBdev1 00:14:28.526 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:28.526 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:28.526 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 
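The waitforbdev BaseBdev1 call traced here is easy to lose in the xtrace noise. Reading the trace, it takes the bdev name, falls back to a 2000 ms timeout when none is given, waits for bdev examination to complete, and then asks bdev_get_bdevs for the named bdev with that timeout. A rough sketch of that shape follows; the retry loop implied by the 'local i' declaration, and anything else autotest_common.sh does beyond what this trace shows, is omitted or assumed.

    rpc() { "$SPDK_DIR/scripts/rpc.py" -s /var/tmp/spdk-raid.sock "$@"; }   # path assumed via $SPDK_DIR

    # Approximate shape of the waitforbdev helper as it appears in this trace
    waitforbdev() {
        local bdev_name=$1
        local bdev_timeout=$2
        [[ -z $bdev_timeout ]] && bdev_timeout=2000            # ms, the default the trace shows
        rpc bdev_wait_for_examine                              # let examine callbacks finish first
        rpc bdev_get_bdevs -b "$bdev_name" -t "$bdev_timeout"  # waits until the bdev is visible or times out
    }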
00:14:28.526 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:28.526 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:28.526 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:28.526 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:28.784 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:29.042 [ 00:14:29.042 { 00:14:29.042 "name": "BaseBdev1", 00:14:29.042 "aliases": [ 00:14:29.042 "54e6f78a-715a-46c8-9825-f69a34ea2249" 00:14:29.042 ], 00:14:29.042 "product_name": "Malloc disk", 00:14:29.042 "block_size": 512, 00:14:29.043 "num_blocks": 65536, 00:14:29.043 "uuid": "54e6f78a-715a-46c8-9825-f69a34ea2249", 00:14:29.043 "assigned_rate_limits": { 00:14:29.043 "rw_ios_per_sec": 0, 00:14:29.043 "rw_mbytes_per_sec": 0, 00:14:29.043 "r_mbytes_per_sec": 0, 00:14:29.043 "w_mbytes_per_sec": 0 00:14:29.043 }, 00:14:29.043 "claimed": true, 00:14:29.043 "claim_type": "exclusive_write", 00:14:29.043 "zoned": false, 00:14:29.043 "supported_io_types": { 00:14:29.043 "read": true, 00:14:29.043 "write": true, 00:14:29.043 "unmap": true, 00:14:29.043 "flush": true, 00:14:29.043 "reset": true, 00:14:29.043 "nvme_admin": false, 00:14:29.043 "nvme_io": false, 00:14:29.043 "nvme_io_md": false, 00:14:29.043 "write_zeroes": true, 00:14:29.043 "zcopy": true, 00:14:29.043 "get_zone_info": false, 00:14:29.043 "zone_management": false, 00:14:29.043 "zone_append": false, 00:14:29.043 "compare": false, 00:14:29.043 "compare_and_write": false, 00:14:29.043 "abort": true, 00:14:29.043 "seek_hole": false, 00:14:29.043 "seek_data": false, 00:14:29.043 "copy": true, 00:14:29.043 "nvme_iov_md": false 00:14:29.043 }, 00:14:29.043 "memory_domains": [ 00:14:29.043 { 00:14:29.043 "dma_device_id": "system", 00:14:29.043 "dma_device_type": 1 00:14:29.043 }, 00:14:29.043 { 00:14:29.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.043 "dma_device_type": 2 00:14:29.043 } 00:14:29.043 ], 00:14:29.043 "driver_specific": {} 00:14:29.043 } 00:14:29.043 ] 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.043 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:29.300 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:29.300 "name": "Existed_Raid", 00:14:29.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:29.300 "strip_size_kb": 64, 00:14:29.300 "state": "configuring", 00:14:29.300 "raid_level": "concat", 00:14:29.300 "superblock": false, 00:14:29.300 "num_base_bdevs": 3, 00:14:29.300 "num_base_bdevs_discovered": 1, 00:14:29.300 "num_base_bdevs_operational": 3, 00:14:29.300 "base_bdevs_list": [ 00:14:29.300 { 00:14:29.300 "name": "BaseBdev1", 00:14:29.300 "uuid": "54e6f78a-715a-46c8-9825-f69a34ea2249", 00:14:29.300 "is_configured": true, 00:14:29.301 "data_offset": 0, 00:14:29.301 "data_size": 65536 00:14:29.301 }, 00:14:29.301 { 00:14:29.301 "name": "BaseBdev2", 00:14:29.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:29.301 "is_configured": false, 00:14:29.301 "data_offset": 0, 00:14:29.301 "data_size": 0 00:14:29.301 }, 00:14:29.301 { 00:14:29.301 "name": "BaseBdev3", 00:14:29.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:29.301 "is_configured": false, 00:14:29.301 "data_offset": 0, 00:14:29.301 "data_size": 0 00:14:29.301 } 00:14:29.301 ] 00:14:29.301 }' 00:14:29.301 13:40:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:29.301 13:40:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.867 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:30.124 [2024-07-12 13:40:18.534983] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:30.124 [2024-07-12 13:40:18.535022] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19cdc20 name Existed_Raid, state configuring 00:14:30.124 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:30.382 [2024-07-12 13:40:18.779653] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:30.382 [2024-07-12 13:40:18.781138] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:30.382 [2024-07-12 13:40:18.781169] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:30.382 [2024-07-12 13:40:18.781179] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:30.382 [2024-07-12 13:40:18.781191] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.382 13:40:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:30.640 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:30.640 "name": "Existed_Raid", 00:14:30.640 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.640 "strip_size_kb": 64, 00:14:30.640 "state": "configuring", 00:14:30.640 "raid_level": "concat", 00:14:30.640 "superblock": false, 00:14:30.640 "num_base_bdevs": 3, 00:14:30.640 "num_base_bdevs_discovered": 1, 00:14:30.640 "num_base_bdevs_operational": 3, 00:14:30.640 "base_bdevs_list": [ 00:14:30.640 { 00:14:30.640 "name": "BaseBdev1", 00:14:30.640 "uuid": "54e6f78a-715a-46c8-9825-f69a34ea2249", 00:14:30.640 "is_configured": true, 00:14:30.640 "data_offset": 0, 00:14:30.640 "data_size": 65536 00:14:30.640 }, 00:14:30.640 { 00:14:30.640 "name": "BaseBdev2", 00:14:30.640 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.640 "is_configured": false, 00:14:30.640 "data_offset": 0, 00:14:30.640 "data_size": 0 00:14:30.640 }, 00:14:30.640 { 00:14:30.640 "name": "BaseBdev3", 00:14:30.640 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.640 "is_configured": false, 00:14:30.640 "data_offset": 0, 00:14:30.640 "data_size": 0 00:14:30.640 } 00:14:30.640 ] 00:14:30.640 }' 00:14:30.640 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.640 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.206 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:31.206 [2024-07-12 13:40:19.749568] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:31.206 BaseBdev2 00:14:31.206 13:40:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:31.206 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:31.206 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:31.206 13:40:19 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:31.206 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:31.206 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:31.206 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:31.464 13:40:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:31.723 [ 00:14:31.723 { 00:14:31.723 "name": "BaseBdev2", 00:14:31.723 "aliases": [ 00:14:31.723 "3d129c00-393b-4180-9ac1-72dee1d5f21a" 00:14:31.723 ], 00:14:31.723 "product_name": "Malloc disk", 00:14:31.723 "block_size": 512, 00:14:31.723 "num_blocks": 65536, 00:14:31.723 "uuid": "3d129c00-393b-4180-9ac1-72dee1d5f21a", 00:14:31.723 "assigned_rate_limits": { 00:14:31.723 "rw_ios_per_sec": 0, 00:14:31.723 "rw_mbytes_per_sec": 0, 00:14:31.723 "r_mbytes_per_sec": 0, 00:14:31.723 "w_mbytes_per_sec": 0 00:14:31.723 }, 00:14:31.723 "claimed": true, 00:14:31.723 "claim_type": "exclusive_write", 00:14:31.723 "zoned": false, 00:14:31.723 "supported_io_types": { 00:14:31.723 "read": true, 00:14:31.723 "write": true, 00:14:31.723 "unmap": true, 00:14:31.723 "flush": true, 00:14:31.723 "reset": true, 00:14:31.723 "nvme_admin": false, 00:14:31.723 "nvme_io": false, 00:14:31.723 "nvme_io_md": false, 00:14:31.723 "write_zeroes": true, 00:14:31.723 "zcopy": true, 00:14:31.723 "get_zone_info": false, 00:14:31.723 "zone_management": false, 00:14:31.723 "zone_append": false, 00:14:31.723 "compare": false, 00:14:31.723 "compare_and_write": false, 00:14:31.723 "abort": true, 00:14:31.723 "seek_hole": false, 00:14:31.723 "seek_data": false, 00:14:31.723 "copy": true, 00:14:31.723 "nvme_iov_md": false 00:14:31.723 }, 00:14:31.723 "memory_domains": [ 00:14:31.723 { 00:14:31.723 "dma_device_id": "system", 00:14:31.723 "dma_device_type": 1 00:14:31.723 }, 00:14:31.723 { 00:14:31.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.723 "dma_device_type": 2 00:14:31.723 } 00:14:31.723 ], 00:14:31.723 "driver_specific": {} 00:14:31.723 } 00:14:31.723 ] 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.723 
13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.723 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:31.984 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.984 "name": "Existed_Raid", 00:14:31.984 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.984 "strip_size_kb": 64, 00:14:31.984 "state": "configuring", 00:14:31.984 "raid_level": "concat", 00:14:31.984 "superblock": false, 00:14:31.984 "num_base_bdevs": 3, 00:14:31.984 "num_base_bdevs_discovered": 2, 00:14:31.984 "num_base_bdevs_operational": 3, 00:14:31.985 "base_bdevs_list": [ 00:14:31.985 { 00:14:31.985 "name": "BaseBdev1", 00:14:31.985 "uuid": "54e6f78a-715a-46c8-9825-f69a34ea2249", 00:14:31.985 "is_configured": true, 00:14:31.985 "data_offset": 0, 00:14:31.985 "data_size": 65536 00:14:31.985 }, 00:14:31.985 { 00:14:31.985 "name": "BaseBdev2", 00:14:31.985 "uuid": "3d129c00-393b-4180-9ac1-72dee1d5f21a", 00:14:31.985 "is_configured": true, 00:14:31.985 "data_offset": 0, 00:14:31.985 "data_size": 65536 00:14:31.985 }, 00:14:31.985 { 00:14:31.985 "name": "BaseBdev3", 00:14:31.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.985 "is_configured": false, 00:14:31.985 "data_offset": 0, 00:14:31.985 "data_size": 0 00:14:31.985 } 00:14:31.985 ] 00:14:31.985 }' 00:14:31.985 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.985 13:40:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:32.678 13:40:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:32.678 [2024-07-12 13:40:21.160656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:32.678 [2024-07-12 13:40:21.160694] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19ceb10 00:14:32.678 [2024-07-12 13:40:21.160702] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:32.678 [2024-07-12 13:40:21.160889] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19ce7e0 00:14:32.678 [2024-07-12 13:40:21.161020] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19ceb10 00:14:32.678 [2024-07-12 13:40:21.161030] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19ceb10 00:14:32.678 [2024-07-12 13:40:21.161191] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:32.678 BaseBdev3 00:14:32.678 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:32.678 13:40:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:32.678 13:40:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:32.678 13:40:21 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:32.678 13:40:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:32.678 13:40:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:32.678 13:40:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:32.962 13:40:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:33.221 [ 00:14:33.221 { 00:14:33.221 "name": "BaseBdev3", 00:14:33.221 "aliases": [ 00:14:33.221 "6aa1120a-8adc-4bbd-a574-2309afe7c3c8" 00:14:33.221 ], 00:14:33.221 "product_name": "Malloc disk", 00:14:33.221 "block_size": 512, 00:14:33.221 "num_blocks": 65536, 00:14:33.221 "uuid": "6aa1120a-8adc-4bbd-a574-2309afe7c3c8", 00:14:33.221 "assigned_rate_limits": { 00:14:33.221 "rw_ios_per_sec": 0, 00:14:33.221 "rw_mbytes_per_sec": 0, 00:14:33.221 "r_mbytes_per_sec": 0, 00:14:33.221 "w_mbytes_per_sec": 0 00:14:33.221 }, 00:14:33.221 "claimed": true, 00:14:33.221 "claim_type": "exclusive_write", 00:14:33.221 "zoned": false, 00:14:33.221 "supported_io_types": { 00:14:33.221 "read": true, 00:14:33.221 "write": true, 00:14:33.221 "unmap": true, 00:14:33.221 "flush": true, 00:14:33.221 "reset": true, 00:14:33.221 "nvme_admin": false, 00:14:33.221 "nvme_io": false, 00:14:33.221 "nvme_io_md": false, 00:14:33.221 "write_zeroes": true, 00:14:33.221 "zcopy": true, 00:14:33.221 "get_zone_info": false, 00:14:33.221 "zone_management": false, 00:14:33.221 "zone_append": false, 00:14:33.221 "compare": false, 00:14:33.221 "compare_and_write": false, 00:14:33.221 "abort": true, 00:14:33.221 "seek_hole": false, 00:14:33.221 "seek_data": false, 00:14:33.221 "copy": true, 00:14:33.221 "nvme_iov_md": false 00:14:33.221 }, 00:14:33.221 "memory_domains": [ 00:14:33.221 { 00:14:33.221 "dma_device_id": "system", 00:14:33.221 "dma_device_type": 1 00:14:33.221 }, 00:14:33.221 { 00:14:33.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.221 "dma_device_type": 2 00:14:33.221 } 00:14:33.221 ], 00:14:33.221 "driver_specific": {} 00:14:33.221 } 00:14:33.221 ] 00:14:33.221 13:40:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:33.221 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:33.221 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:33.221 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:33.221 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:33.221 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:33.221 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:33.221 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:33.222 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:33.222 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:33.222 13:40:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:33.222 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:33.222 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:33.222 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.222 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:33.222 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:33.222 "name": "Existed_Raid", 00:14:33.222 "uuid": "38a940a7-7c75-41e7-a849-60154787034c", 00:14:33.222 "strip_size_kb": 64, 00:14:33.222 "state": "online", 00:14:33.222 "raid_level": "concat", 00:14:33.222 "superblock": false, 00:14:33.222 "num_base_bdevs": 3, 00:14:33.222 "num_base_bdevs_discovered": 3, 00:14:33.222 "num_base_bdevs_operational": 3, 00:14:33.222 "base_bdevs_list": [ 00:14:33.222 { 00:14:33.222 "name": "BaseBdev1", 00:14:33.222 "uuid": "54e6f78a-715a-46c8-9825-f69a34ea2249", 00:14:33.222 "is_configured": true, 00:14:33.222 "data_offset": 0, 00:14:33.222 "data_size": 65536 00:14:33.222 }, 00:14:33.222 { 00:14:33.222 "name": "BaseBdev2", 00:14:33.222 "uuid": "3d129c00-393b-4180-9ac1-72dee1d5f21a", 00:14:33.222 "is_configured": true, 00:14:33.222 "data_offset": 0, 00:14:33.222 "data_size": 65536 00:14:33.222 }, 00:14:33.222 { 00:14:33.222 "name": "BaseBdev3", 00:14:33.222 "uuid": "6aa1120a-8adc-4bbd-a574-2309afe7c3c8", 00:14:33.222 "is_configured": true, 00:14:33.222 "data_offset": 0, 00:14:33.222 "data_size": 65536 00:14:33.222 } 00:14:33.222 ] 00:14:33.222 }' 00:14:33.222 13:40:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:33.222 13:40:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.790 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:33.790 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:34.049 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:34.049 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:34.049 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:34.049 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:34.049 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:34.049 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:34.049 [2024-07-12 13:40:22.600780] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:34.049 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:34.049 "name": "Existed_Raid", 00:14:34.049 "aliases": [ 00:14:34.049 "38a940a7-7c75-41e7-a849-60154787034c" 00:14:34.049 ], 00:14:34.049 "product_name": "Raid Volume", 00:14:34.049 "block_size": 512, 00:14:34.049 "num_blocks": 196608, 00:14:34.049 "uuid": "38a940a7-7c75-41e7-a849-60154787034c", 
00:14:34.049 "assigned_rate_limits": { 00:14:34.049 "rw_ios_per_sec": 0, 00:14:34.049 "rw_mbytes_per_sec": 0, 00:14:34.049 "r_mbytes_per_sec": 0, 00:14:34.049 "w_mbytes_per_sec": 0 00:14:34.049 }, 00:14:34.049 "claimed": false, 00:14:34.049 "zoned": false, 00:14:34.049 "supported_io_types": { 00:14:34.049 "read": true, 00:14:34.049 "write": true, 00:14:34.049 "unmap": true, 00:14:34.049 "flush": true, 00:14:34.049 "reset": true, 00:14:34.049 "nvme_admin": false, 00:14:34.049 "nvme_io": false, 00:14:34.049 "nvme_io_md": false, 00:14:34.049 "write_zeroes": true, 00:14:34.049 "zcopy": false, 00:14:34.049 "get_zone_info": false, 00:14:34.049 "zone_management": false, 00:14:34.049 "zone_append": false, 00:14:34.049 "compare": false, 00:14:34.049 "compare_and_write": false, 00:14:34.049 "abort": false, 00:14:34.049 "seek_hole": false, 00:14:34.049 "seek_data": false, 00:14:34.049 "copy": false, 00:14:34.049 "nvme_iov_md": false 00:14:34.049 }, 00:14:34.049 "memory_domains": [ 00:14:34.049 { 00:14:34.049 "dma_device_id": "system", 00:14:34.049 "dma_device_type": 1 00:14:34.049 }, 00:14:34.049 { 00:14:34.049 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.049 "dma_device_type": 2 00:14:34.049 }, 00:14:34.049 { 00:14:34.049 "dma_device_id": "system", 00:14:34.049 "dma_device_type": 1 00:14:34.049 }, 00:14:34.049 { 00:14:34.049 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.049 "dma_device_type": 2 00:14:34.049 }, 00:14:34.049 { 00:14:34.049 "dma_device_id": "system", 00:14:34.049 "dma_device_type": 1 00:14:34.049 }, 00:14:34.049 { 00:14:34.049 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.049 "dma_device_type": 2 00:14:34.049 } 00:14:34.049 ], 00:14:34.049 "driver_specific": { 00:14:34.049 "raid": { 00:14:34.049 "uuid": "38a940a7-7c75-41e7-a849-60154787034c", 00:14:34.050 "strip_size_kb": 64, 00:14:34.050 "state": "online", 00:14:34.050 "raid_level": "concat", 00:14:34.050 "superblock": false, 00:14:34.050 "num_base_bdevs": 3, 00:14:34.050 "num_base_bdevs_discovered": 3, 00:14:34.050 "num_base_bdevs_operational": 3, 00:14:34.050 "base_bdevs_list": [ 00:14:34.050 { 00:14:34.050 "name": "BaseBdev1", 00:14:34.050 "uuid": "54e6f78a-715a-46c8-9825-f69a34ea2249", 00:14:34.050 "is_configured": true, 00:14:34.050 "data_offset": 0, 00:14:34.050 "data_size": 65536 00:14:34.050 }, 00:14:34.050 { 00:14:34.050 "name": "BaseBdev2", 00:14:34.050 "uuid": "3d129c00-393b-4180-9ac1-72dee1d5f21a", 00:14:34.050 "is_configured": true, 00:14:34.050 "data_offset": 0, 00:14:34.050 "data_size": 65536 00:14:34.050 }, 00:14:34.050 { 00:14:34.050 "name": "BaseBdev3", 00:14:34.050 "uuid": "6aa1120a-8adc-4bbd-a574-2309afe7c3c8", 00:14:34.050 "is_configured": true, 00:14:34.050 "data_offset": 0, 00:14:34.050 "data_size": 65536 00:14:34.050 } 00:14:34.050 ] 00:14:34.050 } 00:14:34.050 } 00:14:34.050 }' 00:14:34.050 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:34.309 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:34.309 BaseBdev2 00:14:34.309 BaseBdev3' 00:14:34.309 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:34.309 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:34.309 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 00:14:34.309 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:34.309 "name": "BaseBdev1", 00:14:34.309 "aliases": [ 00:14:34.309 "54e6f78a-715a-46c8-9825-f69a34ea2249" 00:14:34.309 ], 00:14:34.309 "product_name": "Malloc disk", 00:14:34.309 "block_size": 512, 00:14:34.309 "num_blocks": 65536, 00:14:34.309 "uuid": "54e6f78a-715a-46c8-9825-f69a34ea2249", 00:14:34.309 "assigned_rate_limits": { 00:14:34.309 "rw_ios_per_sec": 0, 00:14:34.309 "rw_mbytes_per_sec": 0, 00:14:34.309 "r_mbytes_per_sec": 0, 00:14:34.309 "w_mbytes_per_sec": 0 00:14:34.309 }, 00:14:34.309 "claimed": true, 00:14:34.309 "claim_type": "exclusive_write", 00:14:34.309 "zoned": false, 00:14:34.309 "supported_io_types": { 00:14:34.309 "read": true, 00:14:34.309 "write": true, 00:14:34.309 "unmap": true, 00:14:34.309 "flush": true, 00:14:34.309 "reset": true, 00:14:34.309 "nvme_admin": false, 00:14:34.309 "nvme_io": false, 00:14:34.309 "nvme_io_md": false, 00:14:34.309 "write_zeroes": true, 00:14:34.309 "zcopy": true, 00:14:34.309 "get_zone_info": false, 00:14:34.309 "zone_management": false, 00:14:34.309 "zone_append": false, 00:14:34.309 "compare": false, 00:14:34.309 "compare_and_write": false, 00:14:34.309 "abort": true, 00:14:34.309 "seek_hole": false, 00:14:34.309 "seek_data": false, 00:14:34.309 "copy": true, 00:14:34.309 "nvme_iov_md": false 00:14:34.309 }, 00:14:34.309 "memory_domains": [ 00:14:34.309 { 00:14:34.309 "dma_device_id": "system", 00:14:34.309 "dma_device_type": 1 00:14:34.309 }, 00:14:34.309 { 00:14:34.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.309 "dma_device_type": 2 00:14:34.309 } 00:14:34.309 ], 00:14:34.309 "driver_specific": {} 00:14:34.309 }' 00:14:34.309 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:34.568 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:34.568 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:34.568 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:34.568 13:40:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:34.568 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:34.568 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:34.568 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:34.568 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:34.568 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:34.826 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:34.826 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:34.826 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:34.826 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:34.826 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:35.084 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:35.084 "name": "BaseBdev2", 00:14:35.084 
"aliases": [ 00:14:35.084 "3d129c00-393b-4180-9ac1-72dee1d5f21a" 00:14:35.084 ], 00:14:35.084 "product_name": "Malloc disk", 00:14:35.084 "block_size": 512, 00:14:35.084 "num_blocks": 65536, 00:14:35.084 "uuid": "3d129c00-393b-4180-9ac1-72dee1d5f21a", 00:14:35.084 "assigned_rate_limits": { 00:14:35.084 "rw_ios_per_sec": 0, 00:14:35.084 "rw_mbytes_per_sec": 0, 00:14:35.084 "r_mbytes_per_sec": 0, 00:14:35.084 "w_mbytes_per_sec": 0 00:14:35.084 }, 00:14:35.084 "claimed": true, 00:14:35.084 "claim_type": "exclusive_write", 00:14:35.084 "zoned": false, 00:14:35.084 "supported_io_types": { 00:14:35.084 "read": true, 00:14:35.084 "write": true, 00:14:35.084 "unmap": true, 00:14:35.084 "flush": true, 00:14:35.084 "reset": true, 00:14:35.084 "nvme_admin": false, 00:14:35.084 "nvme_io": false, 00:14:35.084 "nvme_io_md": false, 00:14:35.084 "write_zeroes": true, 00:14:35.084 "zcopy": true, 00:14:35.084 "get_zone_info": false, 00:14:35.084 "zone_management": false, 00:14:35.084 "zone_append": false, 00:14:35.084 "compare": false, 00:14:35.084 "compare_and_write": false, 00:14:35.084 "abort": true, 00:14:35.084 "seek_hole": false, 00:14:35.084 "seek_data": false, 00:14:35.084 "copy": true, 00:14:35.084 "nvme_iov_md": false 00:14:35.084 }, 00:14:35.084 "memory_domains": [ 00:14:35.084 { 00:14:35.084 "dma_device_id": "system", 00:14:35.084 "dma_device_type": 1 00:14:35.084 }, 00:14:35.084 { 00:14:35.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.084 "dma_device_type": 2 00:14:35.084 } 00:14:35.084 ], 00:14:35.084 "driver_specific": {} 00:14:35.084 }' 00:14:35.084 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:35.084 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:35.084 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:35.084 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:35.084 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:35.084 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:35.084 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:35.343 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:35.343 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:35.343 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:35.343 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:35.343 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:35.343 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:35.343 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:35.343 13:40:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:35.602 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:35.602 "name": "BaseBdev3", 00:14:35.602 "aliases": [ 00:14:35.602 "6aa1120a-8adc-4bbd-a574-2309afe7c3c8" 00:14:35.602 ], 00:14:35.602 "product_name": "Malloc disk", 00:14:35.602 "block_size": 512, 00:14:35.602 
"num_blocks": 65536, 00:14:35.602 "uuid": "6aa1120a-8adc-4bbd-a574-2309afe7c3c8", 00:14:35.602 "assigned_rate_limits": { 00:14:35.602 "rw_ios_per_sec": 0, 00:14:35.602 "rw_mbytes_per_sec": 0, 00:14:35.602 "r_mbytes_per_sec": 0, 00:14:35.602 "w_mbytes_per_sec": 0 00:14:35.602 }, 00:14:35.602 "claimed": true, 00:14:35.602 "claim_type": "exclusive_write", 00:14:35.602 "zoned": false, 00:14:35.602 "supported_io_types": { 00:14:35.602 "read": true, 00:14:35.602 "write": true, 00:14:35.602 "unmap": true, 00:14:35.602 "flush": true, 00:14:35.602 "reset": true, 00:14:35.602 "nvme_admin": false, 00:14:35.602 "nvme_io": false, 00:14:35.602 "nvme_io_md": false, 00:14:35.602 "write_zeroes": true, 00:14:35.602 "zcopy": true, 00:14:35.602 "get_zone_info": false, 00:14:35.602 "zone_management": false, 00:14:35.602 "zone_append": false, 00:14:35.602 "compare": false, 00:14:35.602 "compare_and_write": false, 00:14:35.602 "abort": true, 00:14:35.602 "seek_hole": false, 00:14:35.602 "seek_data": false, 00:14:35.602 "copy": true, 00:14:35.602 "nvme_iov_md": false 00:14:35.602 }, 00:14:35.602 "memory_domains": [ 00:14:35.602 { 00:14:35.602 "dma_device_id": "system", 00:14:35.602 "dma_device_type": 1 00:14:35.602 }, 00:14:35.602 { 00:14:35.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.602 "dma_device_type": 2 00:14:35.602 } 00:14:35.602 ], 00:14:35.602 "driver_specific": {} 00:14:35.602 }' 00:14:35.602 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:35.602 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:35.860 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:35.860 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:35.860 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:35.860 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:35.860 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:35.860 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:35.860 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:35.860 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:35.860 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:36.119 [2024-07-12 13:40:24.670048] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:36.119 [2024-07-12 13:40:24.670076] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:36.119 [2024-07-12 13:40:24.670116] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:36.119 13:40:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.119 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.377 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.377 "name": "Existed_Raid", 00:14:36.377 "uuid": "38a940a7-7c75-41e7-a849-60154787034c", 00:14:36.377 "strip_size_kb": 64, 00:14:36.377 "state": "offline", 00:14:36.377 "raid_level": "concat", 00:14:36.377 "superblock": false, 00:14:36.377 "num_base_bdevs": 3, 00:14:36.377 "num_base_bdevs_discovered": 2, 00:14:36.377 "num_base_bdevs_operational": 2, 00:14:36.377 "base_bdevs_list": [ 00:14:36.377 { 00:14:36.377 "name": null, 00:14:36.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.377 "is_configured": false, 00:14:36.377 "data_offset": 0, 00:14:36.377 "data_size": 65536 00:14:36.377 }, 00:14:36.377 { 00:14:36.377 "name": "BaseBdev2", 00:14:36.377 "uuid": "3d129c00-393b-4180-9ac1-72dee1d5f21a", 00:14:36.377 "is_configured": true, 00:14:36.377 "data_offset": 0, 00:14:36.377 "data_size": 65536 00:14:36.377 }, 00:14:36.377 { 00:14:36.377 "name": "BaseBdev3", 00:14:36.377 "uuid": "6aa1120a-8adc-4bbd-a574-2309afe7c3c8", 00:14:36.377 "is_configured": true, 00:14:36.377 "data_offset": 0, 00:14:36.377 "data_size": 65536 00:14:36.377 } 00:14:36.377 ] 00:14:36.377 }' 00:14:36.377 13:40:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.378 13:40:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.313 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:37.313 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:37.313 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.313 13:40:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:37.313 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:37.313 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:37.313 13:40:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:37.572 [2024-07-12 13:40:25.998578] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:37.572 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:37.572 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:37.572 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.572 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:37.830 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:37.830 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:37.830 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:38.088 [2024-07-12 13:40:26.483203] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:38.088 [2024-07-12 13:40:26.483246] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19ceb10 name Existed_Raid, state offline 00:14:38.088 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:38.088 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:38.088 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.088 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:38.347 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:38.347 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:38.347 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:38.347 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:38.347 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:38.347 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:38.605 BaseBdev2 00:14:38.606 13:40:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:38.606 13:40:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:38.606 13:40:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:38.606 13:40:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
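Because concat carries no redundancy, the script maps it to expected_state=offline as soon as a single base bdev disappears; the deletions above (BaseBdev1 first, then BaseBdev2 and BaseBdev3) drive exactly that transition before standalone malloc disks are rebuilt for the next phase. A hedged spot-check of the same behaviour against a live target, reusing the socket and rpc.py path from this run and the same bdev_malloc_delete / bdev_raid_get_bdevs calls seen above:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # drop one member of the concat array; the raid cannot survive the loss
  $rpc bdev_malloc_delete BaseBdev1
  # with only 2 of 3 base bdevs still discovered, the state reported for Existed_Raid falls to "offline"
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'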
00:14:38.606 13:40:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:38.606 13:40:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:38.606 13:40:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:38.864 13:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:39.123 [ 00:14:39.123 { 00:14:39.123 "name": "BaseBdev2", 00:14:39.123 "aliases": [ 00:14:39.123 "d1fb5e10-8231-4cfa-93b0-31b8340704a9" 00:14:39.123 ], 00:14:39.123 "product_name": "Malloc disk", 00:14:39.123 "block_size": 512, 00:14:39.123 "num_blocks": 65536, 00:14:39.123 "uuid": "d1fb5e10-8231-4cfa-93b0-31b8340704a9", 00:14:39.123 "assigned_rate_limits": { 00:14:39.123 "rw_ios_per_sec": 0, 00:14:39.123 "rw_mbytes_per_sec": 0, 00:14:39.123 "r_mbytes_per_sec": 0, 00:14:39.123 "w_mbytes_per_sec": 0 00:14:39.123 }, 00:14:39.123 "claimed": false, 00:14:39.123 "zoned": false, 00:14:39.123 "supported_io_types": { 00:14:39.123 "read": true, 00:14:39.123 "write": true, 00:14:39.123 "unmap": true, 00:14:39.123 "flush": true, 00:14:39.123 "reset": true, 00:14:39.123 "nvme_admin": false, 00:14:39.123 "nvme_io": false, 00:14:39.123 "nvme_io_md": false, 00:14:39.123 "write_zeroes": true, 00:14:39.123 "zcopy": true, 00:14:39.123 "get_zone_info": false, 00:14:39.123 "zone_management": false, 00:14:39.123 "zone_append": false, 00:14:39.123 "compare": false, 00:14:39.123 "compare_and_write": false, 00:14:39.123 "abort": true, 00:14:39.123 "seek_hole": false, 00:14:39.123 "seek_data": false, 00:14:39.123 "copy": true, 00:14:39.123 "nvme_iov_md": false 00:14:39.123 }, 00:14:39.123 "memory_domains": [ 00:14:39.123 { 00:14:39.123 "dma_device_id": "system", 00:14:39.123 "dma_device_type": 1 00:14:39.123 }, 00:14:39.123 { 00:14:39.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.123 "dma_device_type": 2 00:14:39.123 } 00:14:39.123 ], 00:14:39.123 "driver_specific": {} 00:14:39.123 } 00:14:39.123 ] 00:14:39.123 13:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:39.123 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:39.123 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:39.123 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:39.123 BaseBdev3 00:14:39.381 13:40:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:39.381 13:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:39.381 13:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:39.381 13:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:39.381 13:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:39.381 13:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:39.381 13:40:27 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.381 13:40:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:39.639 [ 00:14:39.639 { 00:14:39.639 "name": "BaseBdev3", 00:14:39.639 "aliases": [ 00:14:39.639 "06660049-7179-418a-a079-5401b6882385" 00:14:39.639 ], 00:14:39.639 "product_name": "Malloc disk", 00:14:39.639 "block_size": 512, 00:14:39.639 "num_blocks": 65536, 00:14:39.639 "uuid": "06660049-7179-418a-a079-5401b6882385", 00:14:39.639 "assigned_rate_limits": { 00:14:39.639 "rw_ios_per_sec": 0, 00:14:39.639 "rw_mbytes_per_sec": 0, 00:14:39.639 "r_mbytes_per_sec": 0, 00:14:39.639 "w_mbytes_per_sec": 0 00:14:39.639 }, 00:14:39.639 "claimed": false, 00:14:39.639 "zoned": false, 00:14:39.639 "supported_io_types": { 00:14:39.639 "read": true, 00:14:39.639 "write": true, 00:14:39.639 "unmap": true, 00:14:39.639 "flush": true, 00:14:39.639 "reset": true, 00:14:39.639 "nvme_admin": false, 00:14:39.639 "nvme_io": false, 00:14:39.639 "nvme_io_md": false, 00:14:39.639 "write_zeroes": true, 00:14:39.639 "zcopy": true, 00:14:39.639 "get_zone_info": false, 00:14:39.639 "zone_management": false, 00:14:39.639 "zone_append": false, 00:14:39.639 "compare": false, 00:14:39.639 "compare_and_write": false, 00:14:39.639 "abort": true, 00:14:39.639 "seek_hole": false, 00:14:39.639 "seek_data": false, 00:14:39.639 "copy": true, 00:14:39.639 "nvme_iov_md": false 00:14:39.639 }, 00:14:39.639 "memory_domains": [ 00:14:39.639 { 00:14:39.639 "dma_device_id": "system", 00:14:39.639 "dma_device_type": 1 00:14:39.639 }, 00:14:39.639 { 00:14:39.640 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.640 "dma_device_type": 2 00:14:39.640 } 00:14:39.640 ], 00:14:39.640 "driver_specific": {} 00:14:39.640 } 00:14:39.640 ] 00:14:39.640 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:39.640 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:39.640 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:39.640 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:39.898 [2024-07-12 13:40:28.373209] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:39.898 [2024-07-12 13:40:28.373251] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:39.898 [2024-07-12 13:40:28.373271] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:39.898 [2024-07-12 13:40:28.374634] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.898 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.157 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.157 "name": "Existed_Raid", 00:14:40.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.157 "strip_size_kb": 64, 00:14:40.157 "state": "configuring", 00:14:40.157 "raid_level": "concat", 00:14:40.157 "superblock": false, 00:14:40.157 "num_base_bdevs": 3, 00:14:40.157 "num_base_bdevs_discovered": 2, 00:14:40.157 "num_base_bdevs_operational": 3, 00:14:40.157 "base_bdevs_list": [ 00:14:40.157 { 00:14:40.157 "name": "BaseBdev1", 00:14:40.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.157 "is_configured": false, 00:14:40.157 "data_offset": 0, 00:14:40.157 "data_size": 0 00:14:40.157 }, 00:14:40.157 { 00:14:40.157 "name": "BaseBdev2", 00:14:40.157 "uuid": "d1fb5e10-8231-4cfa-93b0-31b8340704a9", 00:14:40.157 "is_configured": true, 00:14:40.157 "data_offset": 0, 00:14:40.157 "data_size": 65536 00:14:40.157 }, 00:14:40.157 { 00:14:40.157 "name": "BaseBdev3", 00:14:40.157 "uuid": "06660049-7179-418a-a079-5401b6882385", 00:14:40.157 "is_configured": true, 00:14:40.157 "data_offset": 0, 00:14:40.157 "data_size": 65536 00:14:40.157 } 00:14:40.157 ] 00:14:40.157 }' 00:14:40.157 13:40:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.157 13:40:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.738 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:40.996 [2024-07-12 13:40:29.468079] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:40.996 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:40.996 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:40.996 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:40.996 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:40.996 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:40.996 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:40.996 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:14:40.996 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.996 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:40.996 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.997 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.997 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:41.254 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:41.254 "name": "Existed_Raid", 00:14:41.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.254 "strip_size_kb": 64, 00:14:41.254 "state": "configuring", 00:14:41.254 "raid_level": "concat", 00:14:41.254 "superblock": false, 00:14:41.255 "num_base_bdevs": 3, 00:14:41.255 "num_base_bdevs_discovered": 1, 00:14:41.255 "num_base_bdevs_operational": 3, 00:14:41.255 "base_bdevs_list": [ 00:14:41.255 { 00:14:41.255 "name": "BaseBdev1", 00:14:41.255 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.255 "is_configured": false, 00:14:41.255 "data_offset": 0, 00:14:41.255 "data_size": 0 00:14:41.255 }, 00:14:41.255 { 00:14:41.255 "name": null, 00:14:41.255 "uuid": "d1fb5e10-8231-4cfa-93b0-31b8340704a9", 00:14:41.255 "is_configured": false, 00:14:41.255 "data_offset": 0, 00:14:41.255 "data_size": 65536 00:14:41.255 }, 00:14:41.255 { 00:14:41.255 "name": "BaseBdev3", 00:14:41.255 "uuid": "06660049-7179-418a-a079-5401b6882385", 00:14:41.255 "is_configured": true, 00:14:41.255 "data_offset": 0, 00:14:41.255 "data_size": 65536 00:14:41.255 } 00:14:41.255 ] 00:14:41.255 }' 00:14:41.255 13:40:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:41.255 13:40:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:41.820 13:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.820 13:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:42.078 13:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:42.078 13:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:42.337 [2024-07-12 13:40:30.779398] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:42.337 BaseBdev1 00:14:42.337 13:40:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:42.337 13:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:42.337 13:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:42.337 13:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:42.337 13:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:42.337 13:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:42.337 13:40:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:42.905 13:40:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:43.163 [ 00:14:43.163 { 00:14:43.163 "name": "BaseBdev1", 00:14:43.163 "aliases": [ 00:14:43.163 "696c6a9e-1f39-464c-9aad-4073d4075dd7" 00:14:43.163 ], 00:14:43.163 "product_name": "Malloc disk", 00:14:43.163 "block_size": 512, 00:14:43.163 "num_blocks": 65536, 00:14:43.163 "uuid": "696c6a9e-1f39-464c-9aad-4073d4075dd7", 00:14:43.163 "assigned_rate_limits": { 00:14:43.163 "rw_ios_per_sec": 0, 00:14:43.163 "rw_mbytes_per_sec": 0, 00:14:43.163 "r_mbytes_per_sec": 0, 00:14:43.163 "w_mbytes_per_sec": 0 00:14:43.163 }, 00:14:43.163 "claimed": true, 00:14:43.163 "claim_type": "exclusive_write", 00:14:43.163 "zoned": false, 00:14:43.163 "supported_io_types": { 00:14:43.163 "read": true, 00:14:43.163 "write": true, 00:14:43.163 "unmap": true, 00:14:43.163 "flush": true, 00:14:43.163 "reset": true, 00:14:43.163 "nvme_admin": false, 00:14:43.163 "nvme_io": false, 00:14:43.163 "nvme_io_md": false, 00:14:43.163 "write_zeroes": true, 00:14:43.163 "zcopy": true, 00:14:43.163 "get_zone_info": false, 00:14:43.163 "zone_management": false, 00:14:43.163 "zone_append": false, 00:14:43.163 "compare": false, 00:14:43.163 "compare_and_write": false, 00:14:43.163 "abort": true, 00:14:43.163 "seek_hole": false, 00:14:43.163 "seek_data": false, 00:14:43.163 "copy": true, 00:14:43.163 "nvme_iov_md": false 00:14:43.163 }, 00:14:43.163 "memory_domains": [ 00:14:43.163 { 00:14:43.163 "dma_device_id": "system", 00:14:43.163 "dma_device_type": 1 00:14:43.163 }, 00:14:43.163 { 00:14:43.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.163 "dma_device_type": 2 00:14:43.163 } 00:14:43.163 ], 00:14:43.163 "driver_specific": {} 00:14:43.163 } 00:14:43.163 ] 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:14:43.163 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:43.421 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.421 "name": "Existed_Raid", 00:14:43.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.421 "strip_size_kb": 64, 00:14:43.421 "state": "configuring", 00:14:43.421 "raid_level": "concat", 00:14:43.421 "superblock": false, 00:14:43.421 "num_base_bdevs": 3, 00:14:43.421 "num_base_bdevs_discovered": 2, 00:14:43.421 "num_base_bdevs_operational": 3, 00:14:43.421 "base_bdevs_list": [ 00:14:43.421 { 00:14:43.421 "name": "BaseBdev1", 00:14:43.421 "uuid": "696c6a9e-1f39-464c-9aad-4073d4075dd7", 00:14:43.421 "is_configured": true, 00:14:43.421 "data_offset": 0, 00:14:43.421 "data_size": 65536 00:14:43.421 }, 00:14:43.421 { 00:14:43.421 "name": null, 00:14:43.421 "uuid": "d1fb5e10-8231-4cfa-93b0-31b8340704a9", 00:14:43.421 "is_configured": false, 00:14:43.421 "data_offset": 0, 00:14:43.421 "data_size": 65536 00:14:43.421 }, 00:14:43.421 { 00:14:43.421 "name": "BaseBdev3", 00:14:43.421 "uuid": "06660049-7179-418a-a079-5401b6882385", 00:14:43.421 "is_configured": true, 00:14:43.421 "data_offset": 0, 00:14:43.421 "data_size": 65536 00:14:43.421 } 00:14:43.421 ] 00:14:43.421 }' 00:14:43.421 13:40:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.421 13:40:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.987 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.987 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:44.246 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:44.246 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:44.504 [2024-07-12 13:40:32.864946] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.504 13:40:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:44.762 13:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:44.762 "name": "Existed_Raid", 00:14:44.762 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:44.762 "strip_size_kb": 64, 00:14:44.762 "state": "configuring", 00:14:44.762 "raid_level": "concat", 00:14:44.762 "superblock": false, 00:14:44.762 "num_base_bdevs": 3, 00:14:44.762 "num_base_bdevs_discovered": 1, 00:14:44.762 "num_base_bdevs_operational": 3, 00:14:44.762 "base_bdevs_list": [ 00:14:44.762 { 00:14:44.762 "name": "BaseBdev1", 00:14:44.762 "uuid": "696c6a9e-1f39-464c-9aad-4073d4075dd7", 00:14:44.762 "is_configured": true, 00:14:44.762 "data_offset": 0, 00:14:44.762 "data_size": 65536 00:14:44.762 }, 00:14:44.762 { 00:14:44.762 "name": null, 00:14:44.762 "uuid": "d1fb5e10-8231-4cfa-93b0-31b8340704a9", 00:14:44.762 "is_configured": false, 00:14:44.762 "data_offset": 0, 00:14:44.762 "data_size": 65536 00:14:44.762 }, 00:14:44.762 { 00:14:44.762 "name": null, 00:14:44.762 "uuid": "06660049-7179-418a-a079-5401b6882385", 00:14:44.762 "is_configured": false, 00:14:44.762 "data_offset": 0, 00:14:44.762 "data_size": 65536 00:14:44.762 } 00:14:44.762 ] 00:14:44.762 }' 00:14:44.762 13:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:44.762 13:40:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:45.327 13:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.327 13:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:45.585 13:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:45.585 13:40:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:45.844 [2024-07-12 13:40:34.200489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:45.844 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:45.844 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:45.844 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:45.844 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:45.844 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:45.844 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:45.844 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:45.844 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:45.844 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:45.844 13:40:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:45.844 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.844 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.102 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.102 "name": "Existed_Raid", 00:14:46.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.102 "strip_size_kb": 64, 00:14:46.102 "state": "configuring", 00:14:46.102 "raid_level": "concat", 00:14:46.102 "superblock": false, 00:14:46.102 "num_base_bdevs": 3, 00:14:46.102 "num_base_bdevs_discovered": 2, 00:14:46.102 "num_base_bdevs_operational": 3, 00:14:46.102 "base_bdevs_list": [ 00:14:46.102 { 00:14:46.102 "name": "BaseBdev1", 00:14:46.102 "uuid": "696c6a9e-1f39-464c-9aad-4073d4075dd7", 00:14:46.102 "is_configured": true, 00:14:46.102 "data_offset": 0, 00:14:46.102 "data_size": 65536 00:14:46.102 }, 00:14:46.102 { 00:14:46.102 "name": null, 00:14:46.102 "uuid": "d1fb5e10-8231-4cfa-93b0-31b8340704a9", 00:14:46.102 "is_configured": false, 00:14:46.103 "data_offset": 0, 00:14:46.103 "data_size": 65536 00:14:46.103 }, 00:14:46.103 { 00:14:46.103 "name": "BaseBdev3", 00:14:46.103 "uuid": "06660049-7179-418a-a079-5401b6882385", 00:14:46.103 "is_configured": true, 00:14:46.103 "data_offset": 0, 00:14:46.103 "data_size": 65536 00:14:46.103 } 00:14:46.103 ] 00:14:46.103 }' 00:14:46.103 13:40:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.103 13:40:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:46.670 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.670 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:46.928 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:46.928 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:47.187 [2024-07-12 13:40:35.536050] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:47.187 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:47.187 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:47.187 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:47.187 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:47.187 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:47.187 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:47.187 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:47.187 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.187 13:40:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.187 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.187 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.187 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:47.445 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.445 "name": "Existed_Raid", 00:14:47.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.445 "strip_size_kb": 64, 00:14:47.445 "state": "configuring", 00:14:47.445 "raid_level": "concat", 00:14:47.445 "superblock": false, 00:14:47.445 "num_base_bdevs": 3, 00:14:47.445 "num_base_bdevs_discovered": 1, 00:14:47.445 "num_base_bdevs_operational": 3, 00:14:47.445 "base_bdevs_list": [ 00:14:47.445 { 00:14:47.445 "name": null, 00:14:47.445 "uuid": "696c6a9e-1f39-464c-9aad-4073d4075dd7", 00:14:47.445 "is_configured": false, 00:14:47.445 "data_offset": 0, 00:14:47.445 "data_size": 65536 00:14:47.445 }, 00:14:47.445 { 00:14:47.445 "name": null, 00:14:47.445 "uuid": "d1fb5e10-8231-4cfa-93b0-31b8340704a9", 00:14:47.445 "is_configured": false, 00:14:47.445 "data_offset": 0, 00:14:47.445 "data_size": 65536 00:14:47.445 }, 00:14:47.445 { 00:14:47.445 "name": "BaseBdev3", 00:14:47.445 "uuid": "06660049-7179-418a-a079-5401b6882385", 00:14:47.445 "is_configured": true, 00:14:47.445 "data_offset": 0, 00:14:47.445 "data_size": 65536 00:14:47.445 } 00:14:47.445 ] 00:14:47.445 }' 00:14:47.445 13:40:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.445 13:40:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.012 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.012 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:48.271 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:48.271 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:48.529 [2024-07-12 13:40:36.920144] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:48.529 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:48.529 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:48.529 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:48.529 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:48.530 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:48.530 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:48.530 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:14:48.530 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.530 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.530 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.530 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.530 13:40:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:48.788 13:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:48.788 "name": "Existed_Raid", 00:14:48.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:48.788 "strip_size_kb": 64, 00:14:48.788 "state": "configuring", 00:14:48.788 "raid_level": "concat", 00:14:48.788 "superblock": false, 00:14:48.788 "num_base_bdevs": 3, 00:14:48.788 "num_base_bdevs_discovered": 2, 00:14:48.788 "num_base_bdevs_operational": 3, 00:14:48.788 "base_bdevs_list": [ 00:14:48.788 { 00:14:48.788 "name": null, 00:14:48.788 "uuid": "696c6a9e-1f39-464c-9aad-4073d4075dd7", 00:14:48.788 "is_configured": false, 00:14:48.788 "data_offset": 0, 00:14:48.788 "data_size": 65536 00:14:48.788 }, 00:14:48.788 { 00:14:48.788 "name": "BaseBdev2", 00:14:48.788 "uuid": "d1fb5e10-8231-4cfa-93b0-31b8340704a9", 00:14:48.788 "is_configured": true, 00:14:48.788 "data_offset": 0, 00:14:48.788 "data_size": 65536 00:14:48.788 }, 00:14:48.788 { 00:14:48.788 "name": "BaseBdev3", 00:14:48.788 "uuid": "06660049-7179-418a-a079-5401b6882385", 00:14:48.788 "is_configured": true, 00:14:48.788 "data_offset": 0, 00:14:48.788 "data_size": 65536 00:14:48.788 } 00:14:48.788 ] 00:14:48.788 }' 00:14:48.788 13:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:48.788 13:40:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:49.354 13:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.354 13:40:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:49.612 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:49.612 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.612 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:49.871 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 696c6a9e-1f39-464c-9aad-4073d4075dd7 00:14:50.129 [2024-07-12 13:40:38.489000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:50.129 [2024-07-12 13:40:38.489038] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b75760 00:14:50.129 [2024-07-12 13:40:38.489046] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:50.129 [2024-07-12 13:40:38.489248] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b76c10 00:14:50.129 [2024-07-12 13:40:38.489364] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b75760 00:14:50.129 [2024-07-12 13:40:38.489374] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b75760 00:14:50.129 [2024-07-12 13:40:38.489536] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:50.129 NewBaseBdev 00:14:50.129 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:50.129 13:40:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:50.129 13:40:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:50.129 13:40:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:50.129 13:40:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:50.129 13:40:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:50.129 13:40:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:50.388 13:40:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:50.647 [ 00:14:50.647 { 00:14:50.647 "name": "NewBaseBdev", 00:14:50.647 "aliases": [ 00:14:50.647 "696c6a9e-1f39-464c-9aad-4073d4075dd7" 00:14:50.647 ], 00:14:50.647 "product_name": "Malloc disk", 00:14:50.647 "block_size": 512, 00:14:50.647 "num_blocks": 65536, 00:14:50.647 "uuid": "696c6a9e-1f39-464c-9aad-4073d4075dd7", 00:14:50.647 "assigned_rate_limits": { 00:14:50.647 "rw_ios_per_sec": 0, 00:14:50.647 "rw_mbytes_per_sec": 0, 00:14:50.647 "r_mbytes_per_sec": 0, 00:14:50.647 "w_mbytes_per_sec": 0 00:14:50.647 }, 00:14:50.647 "claimed": true, 00:14:50.647 "claim_type": "exclusive_write", 00:14:50.647 "zoned": false, 00:14:50.647 "supported_io_types": { 00:14:50.647 "read": true, 00:14:50.647 "write": true, 00:14:50.647 "unmap": true, 00:14:50.647 "flush": true, 00:14:50.647 "reset": true, 00:14:50.647 "nvme_admin": false, 00:14:50.647 "nvme_io": false, 00:14:50.647 "nvme_io_md": false, 00:14:50.647 "write_zeroes": true, 00:14:50.647 "zcopy": true, 00:14:50.647 "get_zone_info": false, 00:14:50.647 "zone_management": false, 00:14:50.647 "zone_append": false, 00:14:50.647 "compare": false, 00:14:50.647 "compare_and_write": false, 00:14:50.647 "abort": true, 00:14:50.647 "seek_hole": false, 00:14:50.647 "seek_data": false, 00:14:50.647 "copy": true, 00:14:50.647 "nvme_iov_md": false 00:14:50.647 }, 00:14:50.647 "memory_domains": [ 00:14:50.647 { 00:14:50.647 "dma_device_id": "system", 00:14:50.647 "dma_device_type": 1 00:14:50.647 }, 00:14:50.647 { 00:14:50.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.647 "dma_device_type": 2 00:14:50.647 } 00:14:50.647 ], 00:14:50.647 "driver_specific": {} 00:14:50.647 } 00:14:50.647 ] 00:14:50.647 13:40:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:50.647 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:50.647 13:40:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:50.647 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:50.647 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:50.647 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:50.647 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:50.647 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:50.647 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:50.647 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:50.647 13:40:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:50.647 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.647 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:50.906 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:50.906 "name": "Existed_Raid", 00:14:50.906 "uuid": "f4ab3258-089d-41cd-87e4-f8ca73b846a4", 00:14:50.906 "strip_size_kb": 64, 00:14:50.906 "state": "online", 00:14:50.906 "raid_level": "concat", 00:14:50.906 "superblock": false, 00:14:50.906 "num_base_bdevs": 3, 00:14:50.906 "num_base_bdevs_discovered": 3, 00:14:50.906 "num_base_bdevs_operational": 3, 00:14:50.906 "base_bdevs_list": [ 00:14:50.906 { 00:14:50.906 "name": "NewBaseBdev", 00:14:50.906 "uuid": "696c6a9e-1f39-464c-9aad-4073d4075dd7", 00:14:50.906 "is_configured": true, 00:14:50.906 "data_offset": 0, 00:14:50.906 "data_size": 65536 00:14:50.906 }, 00:14:50.906 { 00:14:50.906 "name": "BaseBdev2", 00:14:50.906 "uuid": "d1fb5e10-8231-4cfa-93b0-31b8340704a9", 00:14:50.906 "is_configured": true, 00:14:50.906 "data_offset": 0, 00:14:50.906 "data_size": 65536 00:14:50.906 }, 00:14:50.906 { 00:14:50.906 "name": "BaseBdev3", 00:14:50.906 "uuid": "06660049-7179-418a-a079-5401b6882385", 00:14:50.906 "is_configured": true, 00:14:50.906 "data_offset": 0, 00:14:50.906 "data_size": 65536 00:14:50.906 } 00:14:50.906 ] 00:14:50.906 }' 00:14:50.906 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:50.906 13:40:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:51.472 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:51.472 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:51.472 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:51.472 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:51.472 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:51.472 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:51.472 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 
-b Existed_Raid 00:14:51.472 13:40:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:51.730 [2024-07-12 13:40:40.073571] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:51.730 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:51.730 "name": "Existed_Raid", 00:14:51.730 "aliases": [ 00:14:51.730 "f4ab3258-089d-41cd-87e4-f8ca73b846a4" 00:14:51.730 ], 00:14:51.730 "product_name": "Raid Volume", 00:14:51.730 "block_size": 512, 00:14:51.730 "num_blocks": 196608, 00:14:51.730 "uuid": "f4ab3258-089d-41cd-87e4-f8ca73b846a4", 00:14:51.730 "assigned_rate_limits": { 00:14:51.730 "rw_ios_per_sec": 0, 00:14:51.730 "rw_mbytes_per_sec": 0, 00:14:51.730 "r_mbytes_per_sec": 0, 00:14:51.730 "w_mbytes_per_sec": 0 00:14:51.730 }, 00:14:51.730 "claimed": false, 00:14:51.730 "zoned": false, 00:14:51.730 "supported_io_types": { 00:14:51.730 "read": true, 00:14:51.730 "write": true, 00:14:51.730 "unmap": true, 00:14:51.730 "flush": true, 00:14:51.730 "reset": true, 00:14:51.730 "nvme_admin": false, 00:14:51.730 "nvme_io": false, 00:14:51.730 "nvme_io_md": false, 00:14:51.730 "write_zeroes": true, 00:14:51.730 "zcopy": false, 00:14:51.730 "get_zone_info": false, 00:14:51.730 "zone_management": false, 00:14:51.730 "zone_append": false, 00:14:51.730 "compare": false, 00:14:51.730 "compare_and_write": false, 00:14:51.730 "abort": false, 00:14:51.731 "seek_hole": false, 00:14:51.731 "seek_data": false, 00:14:51.731 "copy": false, 00:14:51.731 "nvme_iov_md": false 00:14:51.731 }, 00:14:51.731 "memory_domains": [ 00:14:51.731 { 00:14:51.731 "dma_device_id": "system", 00:14:51.731 "dma_device_type": 1 00:14:51.731 }, 00:14:51.731 { 00:14:51.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.731 "dma_device_type": 2 00:14:51.731 }, 00:14:51.731 { 00:14:51.731 "dma_device_id": "system", 00:14:51.731 "dma_device_type": 1 00:14:51.731 }, 00:14:51.731 { 00:14:51.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.731 "dma_device_type": 2 00:14:51.731 }, 00:14:51.731 { 00:14:51.731 "dma_device_id": "system", 00:14:51.731 "dma_device_type": 1 00:14:51.731 }, 00:14:51.731 { 00:14:51.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.731 "dma_device_type": 2 00:14:51.731 } 00:14:51.731 ], 00:14:51.731 "driver_specific": { 00:14:51.731 "raid": { 00:14:51.731 "uuid": "f4ab3258-089d-41cd-87e4-f8ca73b846a4", 00:14:51.731 "strip_size_kb": 64, 00:14:51.731 "state": "online", 00:14:51.731 "raid_level": "concat", 00:14:51.731 "superblock": false, 00:14:51.731 "num_base_bdevs": 3, 00:14:51.731 "num_base_bdevs_discovered": 3, 00:14:51.731 "num_base_bdevs_operational": 3, 00:14:51.731 "base_bdevs_list": [ 00:14:51.731 { 00:14:51.731 "name": "NewBaseBdev", 00:14:51.731 "uuid": "696c6a9e-1f39-464c-9aad-4073d4075dd7", 00:14:51.731 "is_configured": true, 00:14:51.731 "data_offset": 0, 00:14:51.731 "data_size": 65536 00:14:51.731 }, 00:14:51.731 { 00:14:51.731 "name": "BaseBdev2", 00:14:51.731 "uuid": "d1fb5e10-8231-4cfa-93b0-31b8340704a9", 00:14:51.731 "is_configured": true, 00:14:51.731 "data_offset": 0, 00:14:51.731 "data_size": 65536 00:14:51.731 }, 00:14:51.731 { 00:14:51.731 "name": "BaseBdev3", 00:14:51.731 "uuid": "06660049-7179-418a-a079-5401b6882385", 00:14:51.731 "is_configured": true, 00:14:51.731 "data_offset": 0, 00:14:51.731 "data_size": 65536 00:14:51.731 } 00:14:51.731 ] 00:14:51.731 } 00:14:51.731 } 00:14:51.731 }' 00:14:51.731 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:51.731 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:51.731 BaseBdev2 00:14:51.731 BaseBdev3' 00:14:51.731 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:51.731 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:51.731 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:51.989 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:51.989 "name": "NewBaseBdev", 00:14:51.989 "aliases": [ 00:14:51.989 "696c6a9e-1f39-464c-9aad-4073d4075dd7" 00:14:51.989 ], 00:14:51.989 "product_name": "Malloc disk", 00:14:51.989 "block_size": 512, 00:14:51.989 "num_blocks": 65536, 00:14:51.989 "uuid": "696c6a9e-1f39-464c-9aad-4073d4075dd7", 00:14:51.989 "assigned_rate_limits": { 00:14:51.989 "rw_ios_per_sec": 0, 00:14:51.989 "rw_mbytes_per_sec": 0, 00:14:51.989 "r_mbytes_per_sec": 0, 00:14:51.989 "w_mbytes_per_sec": 0 00:14:51.989 }, 00:14:51.989 "claimed": true, 00:14:51.989 "claim_type": "exclusive_write", 00:14:51.989 "zoned": false, 00:14:51.989 "supported_io_types": { 00:14:51.989 "read": true, 00:14:51.989 "write": true, 00:14:51.989 "unmap": true, 00:14:51.989 "flush": true, 00:14:51.989 "reset": true, 00:14:51.989 "nvme_admin": false, 00:14:51.989 "nvme_io": false, 00:14:51.989 "nvme_io_md": false, 00:14:51.989 "write_zeroes": true, 00:14:51.989 "zcopy": true, 00:14:51.989 "get_zone_info": false, 00:14:51.989 "zone_management": false, 00:14:51.989 "zone_append": false, 00:14:51.989 "compare": false, 00:14:51.989 "compare_and_write": false, 00:14:51.989 "abort": true, 00:14:51.989 "seek_hole": false, 00:14:51.989 "seek_data": false, 00:14:51.989 "copy": true, 00:14:51.989 "nvme_iov_md": false 00:14:51.989 }, 00:14:51.989 "memory_domains": [ 00:14:51.989 { 00:14:51.989 "dma_device_id": "system", 00:14:51.989 "dma_device_type": 1 00:14:51.989 }, 00:14:51.989 { 00:14:51.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.989 "dma_device_type": 2 00:14:51.989 } 00:14:51.989 ], 00:14:51.989 "driver_specific": {} 00:14:51.989 }' 00:14:51.989 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.989 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.989 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:51.989 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:51.989 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.247 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:52.247 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.247 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.247 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:52.247 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.247 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.247 13:40:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:52.247 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:52.247 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:52.247 13:40:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:52.506 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:52.506 "name": "BaseBdev2", 00:14:52.506 "aliases": [ 00:14:52.506 "d1fb5e10-8231-4cfa-93b0-31b8340704a9" 00:14:52.506 ], 00:14:52.506 "product_name": "Malloc disk", 00:14:52.506 "block_size": 512, 00:14:52.506 "num_blocks": 65536, 00:14:52.506 "uuid": "d1fb5e10-8231-4cfa-93b0-31b8340704a9", 00:14:52.506 "assigned_rate_limits": { 00:14:52.506 "rw_ios_per_sec": 0, 00:14:52.506 "rw_mbytes_per_sec": 0, 00:14:52.506 "r_mbytes_per_sec": 0, 00:14:52.506 "w_mbytes_per_sec": 0 00:14:52.506 }, 00:14:52.506 "claimed": true, 00:14:52.506 "claim_type": "exclusive_write", 00:14:52.506 "zoned": false, 00:14:52.506 "supported_io_types": { 00:14:52.506 "read": true, 00:14:52.506 "write": true, 00:14:52.506 "unmap": true, 00:14:52.506 "flush": true, 00:14:52.506 "reset": true, 00:14:52.506 "nvme_admin": false, 00:14:52.506 "nvme_io": false, 00:14:52.506 "nvme_io_md": false, 00:14:52.506 "write_zeroes": true, 00:14:52.506 "zcopy": true, 00:14:52.506 "get_zone_info": false, 00:14:52.506 "zone_management": false, 00:14:52.506 "zone_append": false, 00:14:52.506 "compare": false, 00:14:52.506 "compare_and_write": false, 00:14:52.506 "abort": true, 00:14:52.506 "seek_hole": false, 00:14:52.506 "seek_data": false, 00:14:52.506 "copy": true, 00:14:52.506 "nvme_iov_md": false 00:14:52.506 }, 00:14:52.506 "memory_domains": [ 00:14:52.506 { 00:14:52.506 "dma_device_id": "system", 00:14:52.506 "dma_device_type": 1 00:14:52.506 }, 00:14:52.506 { 00:14:52.506 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.506 "dma_device_type": 2 00:14:52.506 } 00:14:52.506 ], 00:14:52.506 "driver_specific": {} 00:14:52.506 }' 00:14:52.506 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.506 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.506 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:52.764 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.764 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.764 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:52.764 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.764 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.764 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:52.764 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.764 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.022 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:53.022 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name 
in $base_bdev_names 00:14:53.022 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:53.022 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:53.022 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:53.022 "name": "BaseBdev3", 00:14:53.022 "aliases": [ 00:14:53.022 "06660049-7179-418a-a079-5401b6882385" 00:14:53.022 ], 00:14:53.022 "product_name": "Malloc disk", 00:14:53.022 "block_size": 512, 00:14:53.022 "num_blocks": 65536, 00:14:53.022 "uuid": "06660049-7179-418a-a079-5401b6882385", 00:14:53.022 "assigned_rate_limits": { 00:14:53.022 "rw_ios_per_sec": 0, 00:14:53.022 "rw_mbytes_per_sec": 0, 00:14:53.022 "r_mbytes_per_sec": 0, 00:14:53.022 "w_mbytes_per_sec": 0 00:14:53.022 }, 00:14:53.022 "claimed": true, 00:14:53.022 "claim_type": "exclusive_write", 00:14:53.022 "zoned": false, 00:14:53.022 "supported_io_types": { 00:14:53.022 "read": true, 00:14:53.022 "write": true, 00:14:53.022 "unmap": true, 00:14:53.022 "flush": true, 00:14:53.022 "reset": true, 00:14:53.022 "nvme_admin": false, 00:14:53.022 "nvme_io": false, 00:14:53.022 "nvme_io_md": false, 00:14:53.023 "write_zeroes": true, 00:14:53.023 "zcopy": true, 00:14:53.023 "get_zone_info": false, 00:14:53.023 "zone_management": false, 00:14:53.023 "zone_append": false, 00:14:53.023 "compare": false, 00:14:53.023 "compare_and_write": false, 00:14:53.023 "abort": true, 00:14:53.023 "seek_hole": false, 00:14:53.023 "seek_data": false, 00:14:53.023 "copy": true, 00:14:53.023 "nvme_iov_md": false 00:14:53.023 }, 00:14:53.023 "memory_domains": [ 00:14:53.023 { 00:14:53.023 "dma_device_id": "system", 00:14:53.023 "dma_device_type": 1 00:14:53.023 }, 00:14:53.023 { 00:14:53.023 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:53.023 "dma_device_type": 2 00:14:53.023 } 00:14:53.023 ], 00:14:53.023 "driver_specific": {} 00:14:53.023 }' 00:14:53.281 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:53.281 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:53.281 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:53.281 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:53.281 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:53.281 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:53.281 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.281 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.281 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:53.281 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.539 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.539 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:53.539 13:40:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:53.539 [2024-07-12 13:40:42.082615] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:53.539 [2024-07-12 13:40:42.082640] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:53.539 [2024-07-12 13:40:42.082695] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:53.539 [2024-07-12 13:40:42.082747] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:53.539 [2024-07-12 13:40:42.082758] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b75760 name Existed_Raid, state offline 00:14:53.539 13:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 461671 00:14:53.539 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 461671 ']' 00:14:53.539 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 461671 00:14:53.539 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:53.539 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:53.539 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 461671 00:14:53.798 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:53.798 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:53.798 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 461671' 00:14:53.798 killing process with pid 461671 00:14:53.798 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 461671 00:14:53.798 [2024-07-12 13:40:42.156871] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:53.798 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 461671 00:14:53.798 [2024-07-12 13:40:42.184042] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:54.056 00:14:54.056 real 0m27.939s 00:14:54.056 user 0m51.258s 00:14:54.056 sys 0m5.076s 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.056 ************************************ 00:14:54.056 END TEST raid_state_function_test 00:14:54.056 ************************************ 00:14:54.056 13:40:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:54.056 13:40:42 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:14:54.056 13:40:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:54.056 13:40:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:54.056 13:40:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:54.056 ************************************ 00:14:54.056 START TEST raid_state_function_test_sb 00:14:54.056 ************************************ 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local 
raid_level=concat 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=465973 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 465973' 00:14:54.056 Process raid pid: 465973 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 465973 /var/tmp/spdk-raid.sock 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 465973 ']' 00:14:54.056 
13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:54.056 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:54.056 13:40:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:54.056 [2024-07-12 13:40:42.546348] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:14:54.056 [2024-07-12 13:40:42.546408] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:54.314 [2024-07-12 13:40:42.674906] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:54.314 [2024-07-12 13:40:42.780112] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:54.314 [2024-07-12 13:40:42.844910] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:54.314 [2024-07-12 13:40:42.844945] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:54.572 13:40:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:54.572 13:40:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:54.572 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:54.831 [2024-07-12 13:40:43.238407] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:54.831 [2024-07-12 13:40:43.238449] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:54.831 [2024-07-12 13:40:43.238460] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:54.831 [2024-07-12 13:40:43.238471] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:54.831 [2024-07-12 13:40:43.238480] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:54.831 [2024-07-12 13:40:43.238491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.831 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:55.090 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.090 "name": "Existed_Raid", 00:14:55.090 "uuid": "f6939d49-c310-4d8b-9aae-b65a38540f0a", 00:14:55.090 "strip_size_kb": 64, 00:14:55.090 "state": "configuring", 00:14:55.090 "raid_level": "concat", 00:14:55.090 "superblock": true, 00:14:55.090 "num_base_bdevs": 3, 00:14:55.090 "num_base_bdevs_discovered": 0, 00:14:55.090 "num_base_bdevs_operational": 3, 00:14:55.090 "base_bdevs_list": [ 00:14:55.090 { 00:14:55.090 "name": "BaseBdev1", 00:14:55.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:55.090 "is_configured": false, 00:14:55.090 "data_offset": 0, 00:14:55.090 "data_size": 0 00:14:55.090 }, 00:14:55.090 { 00:14:55.090 "name": "BaseBdev2", 00:14:55.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:55.090 "is_configured": false, 00:14:55.090 "data_offset": 0, 00:14:55.090 "data_size": 0 00:14:55.090 }, 00:14:55.090 { 00:14:55.090 "name": "BaseBdev3", 00:14:55.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:55.090 "is_configured": false, 00:14:55.090 "data_offset": 0, 00:14:55.090 "data_size": 0 00:14:55.090 } 00:14:55.090 ] 00:14:55.090 }' 00:14:55.090 13:40:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.090 13:40:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:55.657 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:55.916 [2024-07-12 13:40:44.317103] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:55.916 [2024-07-12 13:40:44.317136] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d27350 name Existed_Raid, state configuring 00:14:55.916 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:56.175 [2024-07-12 13:40:44.561781] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:56.175 [2024-07-12 13:40:44.561819] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:56.175 [2024-07-12 13:40:44.561834] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:56.175 [2024-07-12 13:40:44.561846] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:56.175 
[2024-07-12 13:40:44.561855] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:56.175 [2024-07-12 13:40:44.561866] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:56.175 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:56.434 [2024-07-12 13:40:44.817636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:56.434 BaseBdev1 00:14:56.434 13:40:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:56.434 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:56.434 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:56.434 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:56.434 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:56.434 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:56.434 13:40:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:56.693 13:40:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:56.952 [ 00:14:56.952 { 00:14:56.952 "name": "BaseBdev1", 00:14:56.952 "aliases": [ 00:14:56.952 "79ffcc1a-ab60-457d-9c42-c7522a4ce1a9" 00:14:56.952 ], 00:14:56.952 "product_name": "Malloc disk", 00:14:56.952 "block_size": 512, 00:14:56.952 "num_blocks": 65536, 00:14:56.952 "uuid": "79ffcc1a-ab60-457d-9c42-c7522a4ce1a9", 00:14:56.952 "assigned_rate_limits": { 00:14:56.952 "rw_ios_per_sec": 0, 00:14:56.952 "rw_mbytes_per_sec": 0, 00:14:56.952 "r_mbytes_per_sec": 0, 00:14:56.952 "w_mbytes_per_sec": 0 00:14:56.952 }, 00:14:56.952 "claimed": true, 00:14:56.952 "claim_type": "exclusive_write", 00:14:56.952 "zoned": false, 00:14:56.952 "supported_io_types": { 00:14:56.952 "read": true, 00:14:56.952 "write": true, 00:14:56.952 "unmap": true, 00:14:56.952 "flush": true, 00:14:56.952 "reset": true, 00:14:56.952 "nvme_admin": false, 00:14:56.952 "nvme_io": false, 00:14:56.952 "nvme_io_md": false, 00:14:56.952 "write_zeroes": true, 00:14:56.952 "zcopy": true, 00:14:56.952 "get_zone_info": false, 00:14:56.952 "zone_management": false, 00:14:56.952 "zone_append": false, 00:14:56.952 "compare": false, 00:14:56.952 "compare_and_write": false, 00:14:56.952 "abort": true, 00:14:56.952 "seek_hole": false, 00:14:56.952 "seek_data": false, 00:14:56.952 "copy": true, 00:14:56.952 "nvme_iov_md": false 00:14:56.952 }, 00:14:56.952 "memory_domains": [ 00:14:56.952 { 00:14:56.952 "dma_device_id": "system", 00:14:56.952 "dma_device_type": 1 00:14:56.952 }, 00:14:56.952 { 00:14:56.952 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.952 "dma_device_type": 2 00:14:56.952 } 00:14:56.952 ], 00:14:56.952 "driver_specific": {} 00:14:56.952 } 00:14:56.952 ] 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:56.952 13:40:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.952 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:57.210 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:57.210 "name": "Existed_Raid", 00:14:57.210 "uuid": "eef090be-e832-4e3e-b107-676fcde4e241", 00:14:57.210 "strip_size_kb": 64, 00:14:57.210 "state": "configuring", 00:14:57.210 "raid_level": "concat", 00:14:57.210 "superblock": true, 00:14:57.210 "num_base_bdevs": 3, 00:14:57.210 "num_base_bdevs_discovered": 1, 00:14:57.210 "num_base_bdevs_operational": 3, 00:14:57.210 "base_bdevs_list": [ 00:14:57.210 { 00:14:57.210 "name": "BaseBdev1", 00:14:57.210 "uuid": "79ffcc1a-ab60-457d-9c42-c7522a4ce1a9", 00:14:57.210 "is_configured": true, 00:14:57.210 "data_offset": 2048, 00:14:57.210 "data_size": 63488 00:14:57.210 }, 00:14:57.210 { 00:14:57.210 "name": "BaseBdev2", 00:14:57.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:57.210 "is_configured": false, 00:14:57.210 "data_offset": 0, 00:14:57.210 "data_size": 0 00:14:57.210 }, 00:14:57.210 { 00:14:57.210 "name": "BaseBdev3", 00:14:57.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:57.210 "is_configured": false, 00:14:57.210 "data_offset": 0, 00:14:57.210 "data_size": 0 00:14:57.210 } 00:14:57.210 ] 00:14:57.210 }' 00:14:57.210 13:40:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:57.210 13:40:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:57.778 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:57.778 [2024-07-12 13:40:46.297579] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:57.778 [2024-07-12 13:40:46.297620] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d26c20 name Existed_Raid, state configuring 00:14:57.778 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:58.036 [2024-07-12 13:40:46.474086] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:58.036 [2024-07-12 13:40:46.475559] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:58.036 [2024-07-12 13:40:46.475592] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:58.036 [2024-07-12 13:40:46.475602] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:58.036 [2024-07-12 13:40:46.475613] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:58.036 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.293 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:58.293 "name": "Existed_Raid", 00:14:58.293 "uuid": "f2606708-f039-4ab5-8fc4-7466db7baa9e", 00:14:58.293 "strip_size_kb": 64, 00:14:58.293 "state": "configuring", 00:14:58.293 "raid_level": "concat", 00:14:58.293 "superblock": true, 00:14:58.293 "num_base_bdevs": 3, 00:14:58.293 "num_base_bdevs_discovered": 1, 00:14:58.293 "num_base_bdevs_operational": 3, 00:14:58.293 "base_bdevs_list": [ 00:14:58.293 { 00:14:58.293 "name": "BaseBdev1", 00:14:58.293 "uuid": "79ffcc1a-ab60-457d-9c42-c7522a4ce1a9", 00:14:58.293 "is_configured": true, 00:14:58.293 "data_offset": 2048, 00:14:58.293 "data_size": 63488 00:14:58.293 }, 00:14:58.293 { 00:14:58.293 "name": "BaseBdev2", 00:14:58.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.293 "is_configured": false, 00:14:58.293 "data_offset": 0, 00:14:58.293 "data_size": 0 00:14:58.293 }, 00:14:58.293 { 
00:14:58.293 "name": "BaseBdev3", 00:14:58.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:58.293 "is_configured": false, 00:14:58.293 "data_offset": 0, 00:14:58.293 "data_size": 0 00:14:58.293 } 00:14:58.293 ] 00:14:58.293 }' 00:14:58.293 13:40:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:58.293 13:40:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:58.859 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:58.859 [2024-07-12 13:40:47.347907] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:58.859 BaseBdev2 00:14:58.859 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:58.859 13:40:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:58.859 13:40:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:58.859 13:40:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:58.859 13:40:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:58.859 13:40:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:58.859 13:40:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:59.117 13:40:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:59.375 [ 00:14:59.375 { 00:14:59.375 "name": "BaseBdev2", 00:14:59.375 "aliases": [ 00:14:59.375 "7d800d1d-20b7-4287-ac7e-0c737f03fc79" 00:14:59.375 ], 00:14:59.375 "product_name": "Malloc disk", 00:14:59.375 "block_size": 512, 00:14:59.375 "num_blocks": 65536, 00:14:59.375 "uuid": "7d800d1d-20b7-4287-ac7e-0c737f03fc79", 00:14:59.375 "assigned_rate_limits": { 00:14:59.375 "rw_ios_per_sec": 0, 00:14:59.375 "rw_mbytes_per_sec": 0, 00:14:59.375 "r_mbytes_per_sec": 0, 00:14:59.375 "w_mbytes_per_sec": 0 00:14:59.375 }, 00:14:59.375 "claimed": true, 00:14:59.375 "claim_type": "exclusive_write", 00:14:59.375 "zoned": false, 00:14:59.375 "supported_io_types": { 00:14:59.375 "read": true, 00:14:59.375 "write": true, 00:14:59.375 "unmap": true, 00:14:59.375 "flush": true, 00:14:59.375 "reset": true, 00:14:59.375 "nvme_admin": false, 00:14:59.375 "nvme_io": false, 00:14:59.375 "nvme_io_md": false, 00:14:59.375 "write_zeroes": true, 00:14:59.375 "zcopy": true, 00:14:59.375 "get_zone_info": false, 00:14:59.375 "zone_management": false, 00:14:59.375 "zone_append": false, 00:14:59.375 "compare": false, 00:14:59.375 "compare_and_write": false, 00:14:59.375 "abort": true, 00:14:59.375 "seek_hole": false, 00:14:59.375 "seek_data": false, 00:14:59.375 "copy": true, 00:14:59.375 "nvme_iov_md": false 00:14:59.375 }, 00:14:59.375 "memory_domains": [ 00:14:59.375 { 00:14:59.375 "dma_device_id": "system", 00:14:59.375 "dma_device_type": 1 00:14:59.375 }, 00:14:59.375 { 00:14:59.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.375 "dma_device_type": 2 00:14:59.375 } 00:14:59.375 ], 00:14:59.375 
"driver_specific": {} 00:14:59.375 } 00:14:59.375 ] 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.375 "name": "Existed_Raid", 00:14:59.375 "uuid": "f2606708-f039-4ab5-8fc4-7466db7baa9e", 00:14:59.375 "strip_size_kb": 64, 00:14:59.375 "state": "configuring", 00:14:59.375 "raid_level": "concat", 00:14:59.375 "superblock": true, 00:14:59.375 "num_base_bdevs": 3, 00:14:59.375 "num_base_bdevs_discovered": 2, 00:14:59.375 "num_base_bdevs_operational": 3, 00:14:59.375 "base_bdevs_list": [ 00:14:59.375 { 00:14:59.375 "name": "BaseBdev1", 00:14:59.375 "uuid": "79ffcc1a-ab60-457d-9c42-c7522a4ce1a9", 00:14:59.375 "is_configured": true, 00:14:59.375 "data_offset": 2048, 00:14:59.375 "data_size": 63488 00:14:59.375 }, 00:14:59.375 { 00:14:59.375 "name": "BaseBdev2", 00:14:59.375 "uuid": "7d800d1d-20b7-4287-ac7e-0c737f03fc79", 00:14:59.375 "is_configured": true, 00:14:59.375 "data_offset": 2048, 00:14:59.375 "data_size": 63488 00:14:59.375 }, 00:14:59.375 { 00:14:59.375 "name": "BaseBdev3", 00:14:59.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.375 "is_configured": false, 00:14:59.375 "data_offset": 0, 00:14:59.375 "data_size": 0 00:14:59.375 } 00:14:59.375 ] 00:14:59.375 }' 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.375 13:40:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:59.942 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 
00:15:00.200 [2024-07-12 13:40:48.747125] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:00.200 [2024-07-12 13:40:48.747283] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d27b10 00:15:00.200 [2024-07-12 13:40:48.747297] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:00.200 [2024-07-12 13:40:48.747474] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d277e0 00:15:00.200 [2024-07-12 13:40:48.747592] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d27b10 00:15:00.200 [2024-07-12 13:40:48.747603] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d27b10 00:15:00.200 [2024-07-12 13:40:48.747694] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:00.200 BaseBdev3 00:15:00.200 13:40:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:00.200 13:40:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:00.200 13:40:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:00.200 13:40:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:00.200 13:40:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:00.200 13:40:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:00.200 13:40:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:00.458 13:40:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:00.716 [ 00:15:00.716 { 00:15:00.716 "name": "BaseBdev3", 00:15:00.716 "aliases": [ 00:15:00.716 "113e2527-da15-4eb8-b3ae-c488c0a12352" 00:15:00.716 ], 00:15:00.716 "product_name": "Malloc disk", 00:15:00.716 "block_size": 512, 00:15:00.716 "num_blocks": 65536, 00:15:00.716 "uuid": "113e2527-da15-4eb8-b3ae-c488c0a12352", 00:15:00.716 "assigned_rate_limits": { 00:15:00.716 "rw_ios_per_sec": 0, 00:15:00.716 "rw_mbytes_per_sec": 0, 00:15:00.716 "r_mbytes_per_sec": 0, 00:15:00.716 "w_mbytes_per_sec": 0 00:15:00.716 }, 00:15:00.716 "claimed": true, 00:15:00.716 "claim_type": "exclusive_write", 00:15:00.716 "zoned": false, 00:15:00.716 "supported_io_types": { 00:15:00.716 "read": true, 00:15:00.716 "write": true, 00:15:00.716 "unmap": true, 00:15:00.716 "flush": true, 00:15:00.717 "reset": true, 00:15:00.717 "nvme_admin": false, 00:15:00.717 "nvme_io": false, 00:15:00.717 "nvme_io_md": false, 00:15:00.717 "write_zeroes": true, 00:15:00.717 "zcopy": true, 00:15:00.717 "get_zone_info": false, 00:15:00.717 "zone_management": false, 00:15:00.717 "zone_append": false, 00:15:00.717 "compare": false, 00:15:00.717 "compare_and_write": false, 00:15:00.717 "abort": true, 00:15:00.717 "seek_hole": false, 00:15:00.717 "seek_data": false, 00:15:00.717 "copy": true, 00:15:00.717 "nvme_iov_md": false 00:15:00.717 }, 00:15:00.717 "memory_domains": [ 00:15:00.717 { 00:15:00.717 "dma_device_id": "system", 00:15:00.717 "dma_device_type": 1 00:15:00.717 }, 00:15:00.717 { 00:15:00.717 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:00.717 "dma_device_type": 2 00:15:00.717 } 00:15:00.717 ], 00:15:00.717 "driver_specific": {} 00:15:00.717 } 00:15:00.717 ] 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.717 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:00.975 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:00.975 "name": "Existed_Raid", 00:15:00.975 "uuid": "f2606708-f039-4ab5-8fc4-7466db7baa9e", 00:15:00.975 "strip_size_kb": 64, 00:15:00.975 "state": "online", 00:15:00.975 "raid_level": "concat", 00:15:00.975 "superblock": true, 00:15:00.975 "num_base_bdevs": 3, 00:15:00.975 "num_base_bdevs_discovered": 3, 00:15:00.975 "num_base_bdevs_operational": 3, 00:15:00.975 "base_bdevs_list": [ 00:15:00.975 { 00:15:00.975 "name": "BaseBdev1", 00:15:00.975 "uuid": "79ffcc1a-ab60-457d-9c42-c7522a4ce1a9", 00:15:00.975 "is_configured": true, 00:15:00.975 "data_offset": 2048, 00:15:00.975 "data_size": 63488 00:15:00.975 }, 00:15:00.975 { 00:15:00.975 "name": "BaseBdev2", 00:15:00.975 "uuid": "7d800d1d-20b7-4287-ac7e-0c737f03fc79", 00:15:00.975 "is_configured": true, 00:15:00.975 "data_offset": 2048, 00:15:00.975 "data_size": 63488 00:15:00.975 }, 00:15:00.975 { 00:15:00.975 "name": "BaseBdev3", 00:15:00.975 "uuid": "113e2527-da15-4eb8-b3ae-c488c0a12352", 00:15:00.975 "is_configured": true, 00:15:00.975 "data_offset": 2048, 00:15:00.975 "data_size": 63488 00:15:00.975 } 00:15:00.975 ] 00:15:00.975 }' 00:15:00.975 13:40:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:00.975 13:40:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:01.910 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 
00:15:01.910 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:01.910 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:01.910 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:01.910 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:01.910 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:01.910 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:01.910 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:01.910 [2024-07-12 13:40:50.287544] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:01.910 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:01.910 "name": "Existed_Raid", 00:15:01.910 "aliases": [ 00:15:01.910 "f2606708-f039-4ab5-8fc4-7466db7baa9e" 00:15:01.910 ], 00:15:01.910 "product_name": "Raid Volume", 00:15:01.910 "block_size": 512, 00:15:01.910 "num_blocks": 190464, 00:15:01.910 "uuid": "f2606708-f039-4ab5-8fc4-7466db7baa9e", 00:15:01.910 "assigned_rate_limits": { 00:15:01.910 "rw_ios_per_sec": 0, 00:15:01.910 "rw_mbytes_per_sec": 0, 00:15:01.910 "r_mbytes_per_sec": 0, 00:15:01.910 "w_mbytes_per_sec": 0 00:15:01.910 }, 00:15:01.910 "claimed": false, 00:15:01.911 "zoned": false, 00:15:01.911 "supported_io_types": { 00:15:01.911 "read": true, 00:15:01.911 "write": true, 00:15:01.911 "unmap": true, 00:15:01.911 "flush": true, 00:15:01.911 "reset": true, 00:15:01.911 "nvme_admin": false, 00:15:01.911 "nvme_io": false, 00:15:01.911 "nvme_io_md": false, 00:15:01.911 "write_zeroes": true, 00:15:01.911 "zcopy": false, 00:15:01.911 "get_zone_info": false, 00:15:01.911 "zone_management": false, 00:15:01.911 "zone_append": false, 00:15:01.911 "compare": false, 00:15:01.911 "compare_and_write": false, 00:15:01.911 "abort": false, 00:15:01.911 "seek_hole": false, 00:15:01.911 "seek_data": false, 00:15:01.911 "copy": false, 00:15:01.911 "nvme_iov_md": false 00:15:01.911 }, 00:15:01.911 "memory_domains": [ 00:15:01.911 { 00:15:01.911 "dma_device_id": "system", 00:15:01.911 "dma_device_type": 1 00:15:01.911 }, 00:15:01.911 { 00:15:01.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.911 "dma_device_type": 2 00:15:01.911 }, 00:15:01.911 { 00:15:01.911 "dma_device_id": "system", 00:15:01.911 "dma_device_type": 1 00:15:01.911 }, 00:15:01.911 { 00:15:01.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.911 "dma_device_type": 2 00:15:01.911 }, 00:15:01.911 { 00:15:01.911 "dma_device_id": "system", 00:15:01.911 "dma_device_type": 1 00:15:01.911 }, 00:15:01.911 { 00:15:01.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.911 "dma_device_type": 2 00:15:01.911 } 00:15:01.911 ], 00:15:01.911 "driver_specific": { 00:15:01.911 "raid": { 00:15:01.911 "uuid": "f2606708-f039-4ab5-8fc4-7466db7baa9e", 00:15:01.911 "strip_size_kb": 64, 00:15:01.911 "state": "online", 00:15:01.911 "raid_level": "concat", 00:15:01.911 "superblock": true, 00:15:01.911 "num_base_bdevs": 3, 00:15:01.911 "num_base_bdevs_discovered": 3, 00:15:01.911 "num_base_bdevs_operational": 3, 00:15:01.911 "base_bdevs_list": [ 00:15:01.911 { 00:15:01.911 "name": "BaseBdev1", 00:15:01.911 
"uuid": "79ffcc1a-ab60-457d-9c42-c7522a4ce1a9", 00:15:01.911 "is_configured": true, 00:15:01.911 "data_offset": 2048, 00:15:01.911 "data_size": 63488 00:15:01.911 }, 00:15:01.911 { 00:15:01.911 "name": "BaseBdev2", 00:15:01.911 "uuid": "7d800d1d-20b7-4287-ac7e-0c737f03fc79", 00:15:01.911 "is_configured": true, 00:15:01.911 "data_offset": 2048, 00:15:01.911 "data_size": 63488 00:15:01.911 }, 00:15:01.911 { 00:15:01.911 "name": "BaseBdev3", 00:15:01.911 "uuid": "113e2527-da15-4eb8-b3ae-c488c0a12352", 00:15:01.911 "is_configured": true, 00:15:01.911 "data_offset": 2048, 00:15:01.911 "data_size": 63488 00:15:01.911 } 00:15:01.911 ] 00:15:01.911 } 00:15:01.911 } 00:15:01.911 }' 00:15:01.911 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:01.911 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:01.911 BaseBdev2 00:15:01.911 BaseBdev3' 00:15:01.911 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:01.911 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:01.911 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:02.169 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:02.169 "name": "BaseBdev1", 00:15:02.169 "aliases": [ 00:15:02.169 "79ffcc1a-ab60-457d-9c42-c7522a4ce1a9" 00:15:02.169 ], 00:15:02.169 "product_name": "Malloc disk", 00:15:02.169 "block_size": 512, 00:15:02.169 "num_blocks": 65536, 00:15:02.169 "uuid": "79ffcc1a-ab60-457d-9c42-c7522a4ce1a9", 00:15:02.169 "assigned_rate_limits": { 00:15:02.169 "rw_ios_per_sec": 0, 00:15:02.169 "rw_mbytes_per_sec": 0, 00:15:02.169 "r_mbytes_per_sec": 0, 00:15:02.169 "w_mbytes_per_sec": 0 00:15:02.169 }, 00:15:02.169 "claimed": true, 00:15:02.169 "claim_type": "exclusive_write", 00:15:02.169 "zoned": false, 00:15:02.169 "supported_io_types": { 00:15:02.169 "read": true, 00:15:02.169 "write": true, 00:15:02.169 "unmap": true, 00:15:02.169 "flush": true, 00:15:02.169 "reset": true, 00:15:02.169 "nvme_admin": false, 00:15:02.169 "nvme_io": false, 00:15:02.169 "nvme_io_md": false, 00:15:02.169 "write_zeroes": true, 00:15:02.169 "zcopy": true, 00:15:02.169 "get_zone_info": false, 00:15:02.169 "zone_management": false, 00:15:02.169 "zone_append": false, 00:15:02.169 "compare": false, 00:15:02.169 "compare_and_write": false, 00:15:02.169 "abort": true, 00:15:02.169 "seek_hole": false, 00:15:02.169 "seek_data": false, 00:15:02.169 "copy": true, 00:15:02.169 "nvme_iov_md": false 00:15:02.169 }, 00:15:02.169 "memory_domains": [ 00:15:02.169 { 00:15:02.169 "dma_device_id": "system", 00:15:02.169 "dma_device_type": 1 00:15:02.169 }, 00:15:02.169 { 00:15:02.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.169 "dma_device_type": 2 00:15:02.169 } 00:15:02.169 ], 00:15:02.169 "driver_specific": {} 00:15:02.169 }' 00:15:02.169 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.169 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.169 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:02.169 13:40:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.169 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.427 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:02.427 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.427 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.427 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:02.427 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.427 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.427 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:02.427 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:02.427 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:02.427 13:40:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:02.685 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:02.685 "name": "BaseBdev2", 00:15:02.685 "aliases": [ 00:15:02.685 "7d800d1d-20b7-4287-ac7e-0c737f03fc79" 00:15:02.685 ], 00:15:02.685 "product_name": "Malloc disk", 00:15:02.685 "block_size": 512, 00:15:02.685 "num_blocks": 65536, 00:15:02.685 "uuid": "7d800d1d-20b7-4287-ac7e-0c737f03fc79", 00:15:02.685 "assigned_rate_limits": { 00:15:02.685 "rw_ios_per_sec": 0, 00:15:02.685 "rw_mbytes_per_sec": 0, 00:15:02.685 "r_mbytes_per_sec": 0, 00:15:02.685 "w_mbytes_per_sec": 0 00:15:02.685 }, 00:15:02.685 "claimed": true, 00:15:02.685 "claim_type": "exclusive_write", 00:15:02.685 "zoned": false, 00:15:02.685 "supported_io_types": { 00:15:02.685 "read": true, 00:15:02.685 "write": true, 00:15:02.685 "unmap": true, 00:15:02.685 "flush": true, 00:15:02.685 "reset": true, 00:15:02.685 "nvme_admin": false, 00:15:02.685 "nvme_io": false, 00:15:02.685 "nvme_io_md": false, 00:15:02.685 "write_zeroes": true, 00:15:02.685 "zcopy": true, 00:15:02.685 "get_zone_info": false, 00:15:02.685 "zone_management": false, 00:15:02.685 "zone_append": false, 00:15:02.685 "compare": false, 00:15:02.685 "compare_and_write": false, 00:15:02.685 "abort": true, 00:15:02.685 "seek_hole": false, 00:15:02.685 "seek_data": false, 00:15:02.685 "copy": true, 00:15:02.685 "nvme_iov_md": false 00:15:02.685 }, 00:15:02.685 "memory_domains": [ 00:15:02.685 { 00:15:02.685 "dma_device_id": "system", 00:15:02.685 "dma_device_type": 1 00:15:02.685 }, 00:15:02.685 { 00:15:02.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.685 "dma_device_type": 2 00:15:02.685 } 00:15:02.685 ], 00:15:02.685 "driver_specific": {} 00:15:02.685 }' 00:15:02.685 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.685 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:02.943 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:02.943 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:02.943 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:02.943 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:02.943 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.943 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:02.943 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:02.943 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:02.943 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.201 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:03.201 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:03.201 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:03.201 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:03.460 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:03.460 "name": "BaseBdev3", 00:15:03.460 "aliases": [ 00:15:03.460 "113e2527-da15-4eb8-b3ae-c488c0a12352" 00:15:03.460 ], 00:15:03.460 "product_name": "Malloc disk", 00:15:03.460 "block_size": 512, 00:15:03.460 "num_blocks": 65536, 00:15:03.460 "uuid": "113e2527-da15-4eb8-b3ae-c488c0a12352", 00:15:03.460 "assigned_rate_limits": { 00:15:03.460 "rw_ios_per_sec": 0, 00:15:03.460 "rw_mbytes_per_sec": 0, 00:15:03.460 "r_mbytes_per_sec": 0, 00:15:03.460 "w_mbytes_per_sec": 0 00:15:03.460 }, 00:15:03.460 "claimed": true, 00:15:03.460 "claim_type": "exclusive_write", 00:15:03.460 "zoned": false, 00:15:03.460 "supported_io_types": { 00:15:03.460 "read": true, 00:15:03.460 "write": true, 00:15:03.460 "unmap": true, 00:15:03.460 "flush": true, 00:15:03.460 "reset": true, 00:15:03.460 "nvme_admin": false, 00:15:03.460 "nvme_io": false, 00:15:03.460 "nvme_io_md": false, 00:15:03.460 "write_zeroes": true, 00:15:03.460 "zcopy": true, 00:15:03.460 "get_zone_info": false, 00:15:03.460 "zone_management": false, 00:15:03.460 "zone_append": false, 00:15:03.460 "compare": false, 00:15:03.460 "compare_and_write": false, 00:15:03.460 "abort": true, 00:15:03.460 "seek_hole": false, 00:15:03.460 "seek_data": false, 00:15:03.460 "copy": true, 00:15:03.460 "nvme_iov_md": false 00:15:03.460 }, 00:15:03.460 "memory_domains": [ 00:15:03.460 { 00:15:03.460 "dma_device_id": "system", 00:15:03.460 "dma_device_type": 1 00:15:03.460 }, 00:15:03.460 { 00:15:03.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.460 "dma_device_type": 2 00:15:03.460 } 00:15:03.460 ], 00:15:03.460 "driver_specific": {} 00:15:03.460 }' 00:15:03.460 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.460 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:03.460 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:03.460 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.460 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:03.460 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:03.460 
13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.460 13:40:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:03.460 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:03.718 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.718 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:03.718 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:03.718 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:03.977 [2024-07-12 13:40:52.352780] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:03.977 [2024-07-12 13:40:52.352805] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:03.977 [2024-07-12 13:40:52.352845] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.977 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:04.236 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:04.236 "name": "Existed_Raid", 00:15:04.236 "uuid": "f2606708-f039-4ab5-8fc4-7466db7baa9e", 00:15:04.236 "strip_size_kb": 64, 00:15:04.236 "state": "offline", 00:15:04.236 "raid_level": 
"concat", 00:15:04.236 "superblock": true, 00:15:04.236 "num_base_bdevs": 3, 00:15:04.236 "num_base_bdevs_discovered": 2, 00:15:04.236 "num_base_bdevs_operational": 2, 00:15:04.236 "base_bdevs_list": [ 00:15:04.236 { 00:15:04.236 "name": null, 00:15:04.236 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.236 "is_configured": false, 00:15:04.236 "data_offset": 2048, 00:15:04.236 "data_size": 63488 00:15:04.236 }, 00:15:04.236 { 00:15:04.236 "name": "BaseBdev2", 00:15:04.236 "uuid": "7d800d1d-20b7-4287-ac7e-0c737f03fc79", 00:15:04.236 "is_configured": true, 00:15:04.236 "data_offset": 2048, 00:15:04.236 "data_size": 63488 00:15:04.236 }, 00:15:04.236 { 00:15:04.236 "name": "BaseBdev3", 00:15:04.236 "uuid": "113e2527-da15-4eb8-b3ae-c488c0a12352", 00:15:04.236 "is_configured": true, 00:15:04.236 "data_offset": 2048, 00:15:04.236 "data_size": 63488 00:15:04.236 } 00:15:04.236 ] 00:15:04.236 }' 00:15:04.236 13:40:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:04.236 13:40:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:04.801 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:04.801 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:04.801 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.801 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:05.058 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:05.058 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:05.058 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:05.316 [2024-07-12 13:40:53.645238] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:05.316 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:05.316 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:05.316 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.316 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:05.574 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:05.574 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:05.574 13:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:05.832 [2024-07-12 13:40:54.157154] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:05.832 [2024-07-12 13:40:54.157196] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d27b10 name Existed_Raid, state offline 00:15:05.832 13:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:15:05.832 13:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:05.832 13:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.832 13:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:06.089 13:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:06.089 13:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:06.089 13:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:06.089 13:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:06.089 13:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:06.089 13:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:06.089 BaseBdev2 00:15:06.347 13:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:06.347 13:40:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:06.347 13:40:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:06.347 13:40:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:06.347 13:40:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:06.347 13:40:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:06.347 13:40:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:06.347 13:40:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:06.606 [ 00:15:06.606 { 00:15:06.606 "name": "BaseBdev2", 00:15:06.606 "aliases": [ 00:15:06.606 "a36a1874-63c4-40ff-87ea-67ce72a3d72c" 00:15:06.606 ], 00:15:06.606 "product_name": "Malloc disk", 00:15:06.606 "block_size": 512, 00:15:06.606 "num_blocks": 65536, 00:15:06.606 "uuid": "a36a1874-63c4-40ff-87ea-67ce72a3d72c", 00:15:06.606 "assigned_rate_limits": { 00:15:06.606 "rw_ios_per_sec": 0, 00:15:06.606 "rw_mbytes_per_sec": 0, 00:15:06.606 "r_mbytes_per_sec": 0, 00:15:06.606 "w_mbytes_per_sec": 0 00:15:06.606 }, 00:15:06.606 "claimed": false, 00:15:06.606 "zoned": false, 00:15:06.606 "supported_io_types": { 00:15:06.606 "read": true, 00:15:06.606 "write": true, 00:15:06.606 "unmap": true, 00:15:06.606 "flush": true, 00:15:06.606 "reset": true, 00:15:06.606 "nvme_admin": false, 00:15:06.606 "nvme_io": false, 00:15:06.606 "nvme_io_md": false, 00:15:06.606 "write_zeroes": true, 00:15:06.606 "zcopy": true, 00:15:06.606 "get_zone_info": false, 00:15:06.606 "zone_management": false, 00:15:06.606 "zone_append": false, 00:15:06.606 "compare": false, 00:15:06.606 "compare_and_write": false, 00:15:06.606 "abort": true, 00:15:06.606 "seek_hole": false, 00:15:06.606 "seek_data": false, 00:15:06.606 "copy": 
true, 00:15:06.606 "nvme_iov_md": false 00:15:06.606 }, 00:15:06.606 "memory_domains": [ 00:15:06.606 { 00:15:06.606 "dma_device_id": "system", 00:15:06.606 "dma_device_type": 1 00:15:06.606 }, 00:15:06.606 { 00:15:06.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.606 "dma_device_type": 2 00:15:06.606 } 00:15:06.606 ], 00:15:06.606 "driver_specific": {} 00:15:06.606 } 00:15:06.606 ] 00:15:06.606 13:40:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:06.606 13:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:06.606 13:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:06.606 13:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:06.866 BaseBdev3 00:15:06.866 13:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:06.866 13:40:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:06.866 13:40:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:06.866 13:40:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:06.866 13:40:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:06.866 13:40:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:06.866 13:40:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:07.124 13:40:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:07.382 [ 00:15:07.382 { 00:15:07.382 "name": "BaseBdev3", 00:15:07.382 "aliases": [ 00:15:07.382 "2e3b25e6-e8e9-433a-9a20-344097c2ce21" 00:15:07.382 ], 00:15:07.382 "product_name": "Malloc disk", 00:15:07.382 "block_size": 512, 00:15:07.382 "num_blocks": 65536, 00:15:07.382 "uuid": "2e3b25e6-e8e9-433a-9a20-344097c2ce21", 00:15:07.382 "assigned_rate_limits": { 00:15:07.382 "rw_ios_per_sec": 0, 00:15:07.382 "rw_mbytes_per_sec": 0, 00:15:07.382 "r_mbytes_per_sec": 0, 00:15:07.382 "w_mbytes_per_sec": 0 00:15:07.382 }, 00:15:07.382 "claimed": false, 00:15:07.382 "zoned": false, 00:15:07.382 "supported_io_types": { 00:15:07.382 "read": true, 00:15:07.382 "write": true, 00:15:07.382 "unmap": true, 00:15:07.382 "flush": true, 00:15:07.382 "reset": true, 00:15:07.382 "nvme_admin": false, 00:15:07.382 "nvme_io": false, 00:15:07.382 "nvme_io_md": false, 00:15:07.382 "write_zeroes": true, 00:15:07.382 "zcopy": true, 00:15:07.382 "get_zone_info": false, 00:15:07.382 "zone_management": false, 00:15:07.382 "zone_append": false, 00:15:07.382 "compare": false, 00:15:07.382 "compare_and_write": false, 00:15:07.382 "abort": true, 00:15:07.382 "seek_hole": false, 00:15:07.382 "seek_data": false, 00:15:07.382 "copy": true, 00:15:07.382 "nvme_iov_md": false 00:15:07.382 }, 00:15:07.382 "memory_domains": [ 00:15:07.382 { 00:15:07.382 "dma_device_id": "system", 00:15:07.382 "dma_device_type": 1 00:15:07.382 }, 00:15:07.382 { 00:15:07.382 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:07.382 "dma_device_type": 2 00:15:07.382 } 00:15:07.382 ], 00:15:07.382 "driver_specific": {} 00:15:07.382 } 00:15:07.382 ] 00:15:07.382 13:40:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:07.382 13:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:07.382 13:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:07.382 13:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:07.642 [2024-07-12 13:40:56.155888] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:07.642 [2024-07-12 13:40:56.155937] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:07.642 [2024-07-12 13:40:56.155962] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:07.642 [2024-07-12 13:40:56.157345] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.642 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:07.901 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.901 "name": "Existed_Raid", 00:15:07.901 "uuid": "49442895-753d-4f66-9cdf-e8aa214466cb", 00:15:07.901 "strip_size_kb": 64, 00:15:07.901 "state": "configuring", 00:15:07.901 "raid_level": "concat", 00:15:07.901 "superblock": true, 00:15:07.901 "num_base_bdevs": 3, 00:15:07.901 "num_base_bdevs_discovered": 2, 00:15:07.901 "num_base_bdevs_operational": 3, 00:15:07.901 "base_bdevs_list": [ 00:15:07.901 { 00:15:07.901 "name": "BaseBdev1", 00:15:07.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.901 "is_configured": false, 00:15:07.901 "data_offset": 0, 00:15:07.901 "data_size": 0 00:15:07.901 }, 00:15:07.901 { 00:15:07.901 "name": 
"BaseBdev2", 00:15:07.901 "uuid": "a36a1874-63c4-40ff-87ea-67ce72a3d72c", 00:15:07.901 "is_configured": true, 00:15:07.901 "data_offset": 2048, 00:15:07.901 "data_size": 63488 00:15:07.901 }, 00:15:07.901 { 00:15:07.901 "name": "BaseBdev3", 00:15:07.901 "uuid": "2e3b25e6-e8e9-433a-9a20-344097c2ce21", 00:15:07.901 "is_configured": true, 00:15:07.901 "data_offset": 2048, 00:15:07.901 "data_size": 63488 00:15:07.901 } 00:15:07.901 ] 00:15:07.901 }' 00:15:07.901 13:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.901 13:40:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:08.467 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:08.726 [2024-07-12 13:40:57.234737] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.726 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:08.985 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:08.985 "name": "Existed_Raid", 00:15:08.985 "uuid": "49442895-753d-4f66-9cdf-e8aa214466cb", 00:15:08.985 "strip_size_kb": 64, 00:15:08.985 "state": "configuring", 00:15:08.985 "raid_level": "concat", 00:15:08.985 "superblock": true, 00:15:08.985 "num_base_bdevs": 3, 00:15:08.985 "num_base_bdevs_discovered": 1, 00:15:08.985 "num_base_bdevs_operational": 3, 00:15:08.985 "base_bdevs_list": [ 00:15:08.985 { 00:15:08.985 "name": "BaseBdev1", 00:15:08.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:08.985 "is_configured": false, 00:15:08.985 "data_offset": 0, 00:15:08.985 "data_size": 0 00:15:08.985 }, 00:15:08.985 { 00:15:08.985 "name": null, 00:15:08.985 "uuid": "a36a1874-63c4-40ff-87ea-67ce72a3d72c", 00:15:08.985 "is_configured": false, 00:15:08.985 "data_offset": 2048, 00:15:08.985 "data_size": 63488 00:15:08.985 }, 00:15:08.985 { 00:15:08.985 "name": "BaseBdev3", 00:15:08.985 "uuid": 
"2e3b25e6-e8e9-433a-9a20-344097c2ce21", 00:15:08.985 "is_configured": true, 00:15:08.985 "data_offset": 2048, 00:15:08.985 "data_size": 63488 00:15:08.985 } 00:15:08.985 ] 00:15:08.985 }' 00:15:08.985 13:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:08.985 13:40:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:09.551 13:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.551 13:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:09.809 13:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:09.809 13:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:10.067 [2024-07-12 13:40:58.554003] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:10.067 BaseBdev1 00:15:10.067 13:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:10.067 13:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:10.067 13:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:10.067 13:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:10.067 13:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:10.067 13:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:10.068 13:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:10.325 13:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:10.583 [ 00:15:10.583 { 00:15:10.583 "name": "BaseBdev1", 00:15:10.583 "aliases": [ 00:15:10.583 "c0f47c9e-b31d-4976-8bf7-76eb9661a51e" 00:15:10.583 ], 00:15:10.583 "product_name": "Malloc disk", 00:15:10.583 "block_size": 512, 00:15:10.583 "num_blocks": 65536, 00:15:10.583 "uuid": "c0f47c9e-b31d-4976-8bf7-76eb9661a51e", 00:15:10.583 "assigned_rate_limits": { 00:15:10.583 "rw_ios_per_sec": 0, 00:15:10.583 "rw_mbytes_per_sec": 0, 00:15:10.583 "r_mbytes_per_sec": 0, 00:15:10.584 "w_mbytes_per_sec": 0 00:15:10.584 }, 00:15:10.584 "claimed": true, 00:15:10.584 "claim_type": "exclusive_write", 00:15:10.584 "zoned": false, 00:15:10.584 "supported_io_types": { 00:15:10.584 "read": true, 00:15:10.584 "write": true, 00:15:10.584 "unmap": true, 00:15:10.584 "flush": true, 00:15:10.584 "reset": true, 00:15:10.584 "nvme_admin": false, 00:15:10.584 "nvme_io": false, 00:15:10.584 "nvme_io_md": false, 00:15:10.584 "write_zeroes": true, 00:15:10.584 "zcopy": true, 00:15:10.584 "get_zone_info": false, 00:15:10.584 "zone_management": false, 00:15:10.584 "zone_append": false, 00:15:10.584 "compare": false, 00:15:10.584 "compare_and_write": false, 00:15:10.584 "abort": true, 00:15:10.584 "seek_hole": false, 
00:15:10.584 "seek_data": false, 00:15:10.584 "copy": true, 00:15:10.584 "nvme_iov_md": false 00:15:10.584 }, 00:15:10.584 "memory_domains": [ 00:15:10.584 { 00:15:10.584 "dma_device_id": "system", 00:15:10.584 "dma_device_type": 1 00:15:10.584 }, 00:15:10.584 { 00:15:10.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.584 "dma_device_type": 2 00:15:10.584 } 00:15:10.584 ], 00:15:10.584 "driver_specific": {} 00:15:10.584 } 00:15:10.584 ] 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.584 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.842 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.842 "name": "Existed_Raid", 00:15:10.842 "uuid": "49442895-753d-4f66-9cdf-e8aa214466cb", 00:15:10.842 "strip_size_kb": 64, 00:15:10.842 "state": "configuring", 00:15:10.842 "raid_level": "concat", 00:15:10.842 "superblock": true, 00:15:10.842 "num_base_bdevs": 3, 00:15:10.842 "num_base_bdevs_discovered": 2, 00:15:10.842 "num_base_bdevs_operational": 3, 00:15:10.842 "base_bdevs_list": [ 00:15:10.842 { 00:15:10.842 "name": "BaseBdev1", 00:15:10.842 "uuid": "c0f47c9e-b31d-4976-8bf7-76eb9661a51e", 00:15:10.842 "is_configured": true, 00:15:10.842 "data_offset": 2048, 00:15:10.842 "data_size": 63488 00:15:10.842 }, 00:15:10.842 { 00:15:10.842 "name": null, 00:15:10.842 "uuid": "a36a1874-63c4-40ff-87ea-67ce72a3d72c", 00:15:10.842 "is_configured": false, 00:15:10.842 "data_offset": 2048, 00:15:10.842 "data_size": 63488 00:15:10.842 }, 00:15:10.842 { 00:15:10.842 "name": "BaseBdev3", 00:15:10.842 "uuid": "2e3b25e6-e8e9-433a-9a20-344097c2ce21", 00:15:10.842 "is_configured": true, 00:15:10.842 "data_offset": 2048, 00:15:10.842 "data_size": 63488 00:15:10.842 } 00:15:10.842 ] 00:15:10.842 }' 00:15:10.842 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.842 13:40:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:11.407 13:40:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.407 13:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:11.666 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:11.666 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:11.925 [2024-07-12 13:41:00.278625] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.925 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:12.183 13:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:12.183 "name": "Existed_Raid", 00:15:12.183 "uuid": "49442895-753d-4f66-9cdf-e8aa214466cb", 00:15:12.183 "strip_size_kb": 64, 00:15:12.183 "state": "configuring", 00:15:12.183 "raid_level": "concat", 00:15:12.183 "superblock": true, 00:15:12.183 "num_base_bdevs": 3, 00:15:12.183 "num_base_bdevs_discovered": 1, 00:15:12.183 "num_base_bdevs_operational": 3, 00:15:12.183 "base_bdevs_list": [ 00:15:12.183 { 00:15:12.183 "name": "BaseBdev1", 00:15:12.183 "uuid": "c0f47c9e-b31d-4976-8bf7-76eb9661a51e", 00:15:12.183 "is_configured": true, 00:15:12.183 "data_offset": 2048, 00:15:12.183 "data_size": 63488 00:15:12.183 }, 00:15:12.183 { 00:15:12.183 "name": null, 00:15:12.183 "uuid": "a36a1874-63c4-40ff-87ea-67ce72a3d72c", 00:15:12.183 "is_configured": false, 00:15:12.183 "data_offset": 2048, 00:15:12.183 "data_size": 63488 00:15:12.183 }, 00:15:12.183 { 00:15:12.183 "name": null, 00:15:12.183 "uuid": "2e3b25e6-e8e9-433a-9a20-344097c2ce21", 00:15:12.183 "is_configured": false, 00:15:12.183 "data_offset": 2048, 00:15:12.183 "data_size": 63488 00:15:12.183 } 00:15:12.183 ] 00:15:12.183 }' 00:15:12.183 13:41:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:12.183 13:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:13.119 13:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.119 13:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:13.377 13:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:13.377 13:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:13.635 [2024-07-12 13:41:01.995194] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.635 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.892 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.892 "name": "Existed_Raid", 00:15:13.892 "uuid": "49442895-753d-4f66-9cdf-e8aa214466cb", 00:15:13.892 "strip_size_kb": 64, 00:15:13.892 "state": "configuring", 00:15:13.892 "raid_level": "concat", 00:15:13.892 "superblock": true, 00:15:13.892 "num_base_bdevs": 3, 00:15:13.892 "num_base_bdevs_discovered": 2, 00:15:13.892 "num_base_bdevs_operational": 3, 00:15:13.892 "base_bdevs_list": [ 00:15:13.892 { 00:15:13.892 "name": "BaseBdev1", 00:15:13.892 "uuid": "c0f47c9e-b31d-4976-8bf7-76eb9661a51e", 00:15:13.892 "is_configured": true, 00:15:13.892 "data_offset": 2048, 00:15:13.892 "data_size": 63488 00:15:13.892 }, 00:15:13.892 { 00:15:13.892 "name": null, 00:15:13.892 "uuid": "a36a1874-63c4-40ff-87ea-67ce72a3d72c", 00:15:13.892 "is_configured": false, 00:15:13.892 "data_offset": 2048, 00:15:13.892 "data_size": 63488 00:15:13.892 }, 00:15:13.892 { 00:15:13.892 "name": "BaseBdev3", 00:15:13.892 "uuid": "2e3b25e6-e8e9-433a-9a20-344097c2ce21", 
00:15:13.892 "is_configured": true, 00:15:13.892 "data_offset": 2048, 00:15:13.892 "data_size": 63488 00:15:13.892 } 00:15:13.892 ] 00:15:13.892 }' 00:15:13.892 13:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.892 13:41:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:14.861 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:14.861 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.861 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:14.861 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:15.119 [2024-07-12 13:41:03.631546] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.119 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:15.375 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.375 "name": "Existed_Raid", 00:15:15.375 "uuid": "49442895-753d-4f66-9cdf-e8aa214466cb", 00:15:15.375 "strip_size_kb": 64, 00:15:15.375 "state": "configuring", 00:15:15.375 "raid_level": "concat", 00:15:15.375 "superblock": true, 00:15:15.375 "num_base_bdevs": 3, 00:15:15.375 "num_base_bdevs_discovered": 1, 00:15:15.375 "num_base_bdevs_operational": 3, 00:15:15.375 "base_bdevs_list": [ 00:15:15.375 { 00:15:15.375 "name": null, 00:15:15.375 "uuid": "c0f47c9e-b31d-4976-8bf7-76eb9661a51e", 00:15:15.375 "is_configured": false, 00:15:15.375 "data_offset": 2048, 00:15:15.375 "data_size": 63488 00:15:15.375 }, 00:15:15.375 { 00:15:15.375 "name": null, 00:15:15.375 "uuid": "a36a1874-63c4-40ff-87ea-67ce72a3d72c", 00:15:15.375 "is_configured": false, 00:15:15.375 "data_offset": 2048, 
00:15:15.375 "data_size": 63488 00:15:15.375 }, 00:15:15.375 { 00:15:15.375 "name": "BaseBdev3", 00:15:15.375 "uuid": "2e3b25e6-e8e9-433a-9a20-344097c2ce21", 00:15:15.375 "is_configured": true, 00:15:15.375 "data_offset": 2048, 00:15:15.375 "data_size": 63488 00:15:15.375 } 00:15:15.375 ] 00:15:15.375 }' 00:15:15.375 13:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:15.375 13:41:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:16.309 13:41:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.309 13:41:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:16.567 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:16.567 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:16.826 [2024-07-12 13:41:05.246641] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.826 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.085 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.085 "name": "Existed_Raid", 00:15:17.085 "uuid": "49442895-753d-4f66-9cdf-e8aa214466cb", 00:15:17.085 "strip_size_kb": 64, 00:15:17.085 "state": "configuring", 00:15:17.085 "raid_level": "concat", 00:15:17.085 "superblock": true, 00:15:17.085 "num_base_bdevs": 3, 00:15:17.085 "num_base_bdevs_discovered": 2, 00:15:17.085 "num_base_bdevs_operational": 3, 00:15:17.085 "base_bdevs_list": [ 00:15:17.085 { 00:15:17.085 "name": null, 00:15:17.085 "uuid": "c0f47c9e-b31d-4976-8bf7-76eb9661a51e", 00:15:17.085 "is_configured": false, 00:15:17.085 "data_offset": 2048, 00:15:17.085 
"data_size": 63488 00:15:17.085 }, 00:15:17.086 { 00:15:17.086 "name": "BaseBdev2", 00:15:17.086 "uuid": "a36a1874-63c4-40ff-87ea-67ce72a3d72c", 00:15:17.086 "is_configured": true, 00:15:17.086 "data_offset": 2048, 00:15:17.086 "data_size": 63488 00:15:17.086 }, 00:15:17.086 { 00:15:17.086 "name": "BaseBdev3", 00:15:17.086 "uuid": "2e3b25e6-e8e9-433a-9a20-344097c2ce21", 00:15:17.086 "is_configured": true, 00:15:17.086 "data_offset": 2048, 00:15:17.086 "data_size": 63488 00:15:17.086 } 00:15:17.086 ] 00:15:17.086 }' 00:15:17.086 13:41:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.086 13:41:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:18.020 13:41:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.020 13:41:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:18.279 13:41:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:18.279 13:41:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.279 13:41:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:18.537 13:41:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c0f47c9e-b31d-4976-8bf7-76eb9661a51e 00:15:18.796 [2024-07-12 13:41:07.159153] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:18.796 [2024-07-12 13:41:07.159302] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d25e90 00:15:18.796 [2024-07-12 13:41:07.159316] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:18.796 [2024-07-12 13:41:07.159489] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c41cd0 00:15:18.796 [2024-07-12 13:41:07.159601] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d25e90 00:15:18.796 [2024-07-12 13:41:07.159611] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d25e90 00:15:18.796 [2024-07-12 13:41:07.159699] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:18.796 NewBaseBdev 00:15:18.796 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:18.796 13:41:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:18.796 13:41:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:18.796 13:41:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:18.796 13:41:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:18.796 13:41:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:18.796 13:41:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:19.055 13:41:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:19.314 [ 00:15:19.314 { 00:15:19.314 "name": "NewBaseBdev", 00:15:19.314 "aliases": [ 00:15:19.314 "c0f47c9e-b31d-4976-8bf7-76eb9661a51e" 00:15:19.314 ], 00:15:19.314 "product_name": "Malloc disk", 00:15:19.314 "block_size": 512, 00:15:19.314 "num_blocks": 65536, 00:15:19.314 "uuid": "c0f47c9e-b31d-4976-8bf7-76eb9661a51e", 00:15:19.314 "assigned_rate_limits": { 00:15:19.314 "rw_ios_per_sec": 0, 00:15:19.314 "rw_mbytes_per_sec": 0, 00:15:19.314 "r_mbytes_per_sec": 0, 00:15:19.314 "w_mbytes_per_sec": 0 00:15:19.314 }, 00:15:19.314 "claimed": true, 00:15:19.314 "claim_type": "exclusive_write", 00:15:19.314 "zoned": false, 00:15:19.314 "supported_io_types": { 00:15:19.314 "read": true, 00:15:19.314 "write": true, 00:15:19.314 "unmap": true, 00:15:19.314 "flush": true, 00:15:19.314 "reset": true, 00:15:19.314 "nvme_admin": false, 00:15:19.314 "nvme_io": false, 00:15:19.314 "nvme_io_md": false, 00:15:19.314 "write_zeroes": true, 00:15:19.314 "zcopy": true, 00:15:19.314 "get_zone_info": false, 00:15:19.314 "zone_management": false, 00:15:19.314 "zone_append": false, 00:15:19.314 "compare": false, 00:15:19.314 "compare_and_write": false, 00:15:19.314 "abort": true, 00:15:19.314 "seek_hole": false, 00:15:19.314 "seek_data": false, 00:15:19.314 "copy": true, 00:15:19.314 "nvme_iov_md": false 00:15:19.314 }, 00:15:19.314 "memory_domains": [ 00:15:19.314 { 00:15:19.314 "dma_device_id": "system", 00:15:19.314 "dma_device_type": 1 00:15:19.314 }, 00:15:19.314 { 00:15:19.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.314 "dma_device_type": 2 00:15:19.314 } 00:15:19.314 ], 00:15:19.314 "driver_specific": {} 00:15:19.314 } 00:15:19.314 ] 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.314 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:15:19.573 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.573 "name": "Existed_Raid", 00:15:19.573 "uuid": "49442895-753d-4f66-9cdf-e8aa214466cb", 00:15:19.573 "strip_size_kb": 64, 00:15:19.573 "state": "online", 00:15:19.573 "raid_level": "concat", 00:15:19.573 "superblock": true, 00:15:19.573 "num_base_bdevs": 3, 00:15:19.573 "num_base_bdevs_discovered": 3, 00:15:19.573 "num_base_bdevs_operational": 3, 00:15:19.573 "base_bdevs_list": [ 00:15:19.573 { 00:15:19.573 "name": "NewBaseBdev", 00:15:19.573 "uuid": "c0f47c9e-b31d-4976-8bf7-76eb9661a51e", 00:15:19.573 "is_configured": true, 00:15:19.573 "data_offset": 2048, 00:15:19.573 "data_size": 63488 00:15:19.573 }, 00:15:19.573 { 00:15:19.573 "name": "BaseBdev2", 00:15:19.573 "uuid": "a36a1874-63c4-40ff-87ea-67ce72a3d72c", 00:15:19.573 "is_configured": true, 00:15:19.573 "data_offset": 2048, 00:15:19.573 "data_size": 63488 00:15:19.573 }, 00:15:19.573 { 00:15:19.573 "name": "BaseBdev3", 00:15:19.573 "uuid": "2e3b25e6-e8e9-433a-9a20-344097c2ce21", 00:15:19.573 "is_configured": true, 00:15:19.573 "data_offset": 2048, 00:15:19.573 "data_size": 63488 00:15:19.573 } 00:15:19.573 ] 00:15:19.573 }' 00:15:19.573 13:41:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.573 13:41:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:20.140 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:20.140 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:20.140 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:20.140 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:20.140 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:20.140 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:20.140 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:20.140 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:20.399 [2024-07-12 13:41:08.747675] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:20.399 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:20.399 "name": "Existed_Raid", 00:15:20.399 "aliases": [ 00:15:20.399 "49442895-753d-4f66-9cdf-e8aa214466cb" 00:15:20.399 ], 00:15:20.399 "product_name": "Raid Volume", 00:15:20.399 "block_size": 512, 00:15:20.399 "num_blocks": 190464, 00:15:20.399 "uuid": "49442895-753d-4f66-9cdf-e8aa214466cb", 00:15:20.399 "assigned_rate_limits": { 00:15:20.399 "rw_ios_per_sec": 0, 00:15:20.399 "rw_mbytes_per_sec": 0, 00:15:20.399 "r_mbytes_per_sec": 0, 00:15:20.399 "w_mbytes_per_sec": 0 00:15:20.399 }, 00:15:20.399 "claimed": false, 00:15:20.399 "zoned": false, 00:15:20.399 "supported_io_types": { 00:15:20.399 "read": true, 00:15:20.399 "write": true, 00:15:20.399 "unmap": true, 00:15:20.399 "flush": true, 00:15:20.399 "reset": true, 00:15:20.399 "nvme_admin": false, 00:15:20.399 "nvme_io": false, 00:15:20.399 "nvme_io_md": false, 00:15:20.399 "write_zeroes": true, 00:15:20.399 
"zcopy": false, 00:15:20.399 "get_zone_info": false, 00:15:20.399 "zone_management": false, 00:15:20.399 "zone_append": false, 00:15:20.399 "compare": false, 00:15:20.399 "compare_and_write": false, 00:15:20.399 "abort": false, 00:15:20.399 "seek_hole": false, 00:15:20.399 "seek_data": false, 00:15:20.399 "copy": false, 00:15:20.399 "nvme_iov_md": false 00:15:20.399 }, 00:15:20.399 "memory_domains": [ 00:15:20.399 { 00:15:20.399 "dma_device_id": "system", 00:15:20.399 "dma_device_type": 1 00:15:20.399 }, 00:15:20.399 { 00:15:20.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.399 "dma_device_type": 2 00:15:20.399 }, 00:15:20.399 { 00:15:20.399 "dma_device_id": "system", 00:15:20.399 "dma_device_type": 1 00:15:20.399 }, 00:15:20.399 { 00:15:20.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.399 "dma_device_type": 2 00:15:20.399 }, 00:15:20.399 { 00:15:20.399 "dma_device_id": "system", 00:15:20.399 "dma_device_type": 1 00:15:20.399 }, 00:15:20.399 { 00:15:20.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.399 "dma_device_type": 2 00:15:20.399 } 00:15:20.399 ], 00:15:20.399 "driver_specific": { 00:15:20.399 "raid": { 00:15:20.399 "uuid": "49442895-753d-4f66-9cdf-e8aa214466cb", 00:15:20.399 "strip_size_kb": 64, 00:15:20.399 "state": "online", 00:15:20.399 "raid_level": "concat", 00:15:20.399 "superblock": true, 00:15:20.399 "num_base_bdevs": 3, 00:15:20.399 "num_base_bdevs_discovered": 3, 00:15:20.399 "num_base_bdevs_operational": 3, 00:15:20.399 "base_bdevs_list": [ 00:15:20.399 { 00:15:20.399 "name": "NewBaseBdev", 00:15:20.399 "uuid": "c0f47c9e-b31d-4976-8bf7-76eb9661a51e", 00:15:20.399 "is_configured": true, 00:15:20.399 "data_offset": 2048, 00:15:20.399 "data_size": 63488 00:15:20.399 }, 00:15:20.399 { 00:15:20.399 "name": "BaseBdev2", 00:15:20.399 "uuid": "a36a1874-63c4-40ff-87ea-67ce72a3d72c", 00:15:20.399 "is_configured": true, 00:15:20.399 "data_offset": 2048, 00:15:20.399 "data_size": 63488 00:15:20.399 }, 00:15:20.399 { 00:15:20.399 "name": "BaseBdev3", 00:15:20.399 "uuid": "2e3b25e6-e8e9-433a-9a20-344097c2ce21", 00:15:20.399 "is_configured": true, 00:15:20.399 "data_offset": 2048, 00:15:20.399 "data_size": 63488 00:15:20.399 } 00:15:20.399 ] 00:15:20.399 } 00:15:20.399 } 00:15:20.399 }' 00:15:20.399 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:20.399 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:20.399 BaseBdev2 00:15:20.399 BaseBdev3' 00:15:20.399 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:20.399 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:20.399 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:20.658 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:20.658 "name": "NewBaseBdev", 00:15:20.658 "aliases": [ 00:15:20.658 "c0f47c9e-b31d-4976-8bf7-76eb9661a51e" 00:15:20.658 ], 00:15:20.658 "product_name": "Malloc disk", 00:15:20.658 "block_size": 512, 00:15:20.658 "num_blocks": 65536, 00:15:20.658 "uuid": "c0f47c9e-b31d-4976-8bf7-76eb9661a51e", 00:15:20.658 "assigned_rate_limits": { 00:15:20.658 "rw_ios_per_sec": 0, 00:15:20.658 "rw_mbytes_per_sec": 0, 
00:15:20.658 "r_mbytes_per_sec": 0, 00:15:20.658 "w_mbytes_per_sec": 0 00:15:20.658 }, 00:15:20.658 "claimed": true, 00:15:20.658 "claim_type": "exclusive_write", 00:15:20.658 "zoned": false, 00:15:20.658 "supported_io_types": { 00:15:20.658 "read": true, 00:15:20.658 "write": true, 00:15:20.658 "unmap": true, 00:15:20.658 "flush": true, 00:15:20.658 "reset": true, 00:15:20.658 "nvme_admin": false, 00:15:20.658 "nvme_io": false, 00:15:20.658 "nvme_io_md": false, 00:15:20.658 "write_zeroes": true, 00:15:20.658 "zcopy": true, 00:15:20.658 "get_zone_info": false, 00:15:20.658 "zone_management": false, 00:15:20.658 "zone_append": false, 00:15:20.658 "compare": false, 00:15:20.658 "compare_and_write": false, 00:15:20.658 "abort": true, 00:15:20.658 "seek_hole": false, 00:15:20.658 "seek_data": false, 00:15:20.658 "copy": true, 00:15:20.658 "nvme_iov_md": false 00:15:20.658 }, 00:15:20.658 "memory_domains": [ 00:15:20.658 { 00:15:20.658 "dma_device_id": "system", 00:15:20.658 "dma_device_type": 1 00:15:20.658 }, 00:15:20.658 { 00:15:20.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.658 "dma_device_type": 2 00:15:20.658 } 00:15:20.658 ], 00:15:20.658 "driver_specific": {} 00:15:20.658 }' 00:15:20.658 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:20.658 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:20.658 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:20.658 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:20.658 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:20.917 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:20.917 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.917 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.917 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:20.917 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.917 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.917 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:20.917 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:20.917 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:20.917 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:21.175 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:21.175 "name": "BaseBdev2", 00:15:21.175 "aliases": [ 00:15:21.175 "a36a1874-63c4-40ff-87ea-67ce72a3d72c" 00:15:21.175 ], 00:15:21.175 "product_name": "Malloc disk", 00:15:21.175 "block_size": 512, 00:15:21.175 "num_blocks": 65536, 00:15:21.175 "uuid": "a36a1874-63c4-40ff-87ea-67ce72a3d72c", 00:15:21.175 "assigned_rate_limits": { 00:15:21.175 "rw_ios_per_sec": 0, 00:15:21.175 "rw_mbytes_per_sec": 0, 00:15:21.175 "r_mbytes_per_sec": 0, 00:15:21.175 "w_mbytes_per_sec": 0 00:15:21.175 }, 00:15:21.175 "claimed": true, 00:15:21.175 
"claim_type": "exclusive_write", 00:15:21.175 "zoned": false, 00:15:21.175 "supported_io_types": { 00:15:21.175 "read": true, 00:15:21.175 "write": true, 00:15:21.175 "unmap": true, 00:15:21.175 "flush": true, 00:15:21.175 "reset": true, 00:15:21.175 "nvme_admin": false, 00:15:21.175 "nvme_io": false, 00:15:21.175 "nvme_io_md": false, 00:15:21.175 "write_zeroes": true, 00:15:21.175 "zcopy": true, 00:15:21.175 "get_zone_info": false, 00:15:21.175 "zone_management": false, 00:15:21.175 "zone_append": false, 00:15:21.175 "compare": false, 00:15:21.175 "compare_and_write": false, 00:15:21.175 "abort": true, 00:15:21.175 "seek_hole": false, 00:15:21.175 "seek_data": false, 00:15:21.175 "copy": true, 00:15:21.175 "nvme_iov_md": false 00:15:21.175 }, 00:15:21.175 "memory_domains": [ 00:15:21.175 { 00:15:21.175 "dma_device_id": "system", 00:15:21.175 "dma_device_type": 1 00:15:21.175 }, 00:15:21.175 { 00:15:21.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.175 "dma_device_type": 2 00:15:21.175 } 00:15:21.175 ], 00:15:21.175 "driver_specific": {} 00:15:21.175 }' 00:15:21.175 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:21.175 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:21.434 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:21.434 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:21.434 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:21.434 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:21.434 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:21.434 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:21.434 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:21.434 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:21.434 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:21.693 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:21.693 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:21.693 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:21.693 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:21.952 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:21.952 "name": "BaseBdev3", 00:15:21.952 "aliases": [ 00:15:21.952 "2e3b25e6-e8e9-433a-9a20-344097c2ce21" 00:15:21.952 ], 00:15:21.952 "product_name": "Malloc disk", 00:15:21.952 "block_size": 512, 00:15:21.952 "num_blocks": 65536, 00:15:21.952 "uuid": "2e3b25e6-e8e9-433a-9a20-344097c2ce21", 00:15:21.952 "assigned_rate_limits": { 00:15:21.952 "rw_ios_per_sec": 0, 00:15:21.952 "rw_mbytes_per_sec": 0, 00:15:21.952 "r_mbytes_per_sec": 0, 00:15:21.952 "w_mbytes_per_sec": 0 00:15:21.952 }, 00:15:21.952 "claimed": true, 00:15:21.952 "claim_type": "exclusive_write", 00:15:21.952 "zoned": false, 00:15:21.952 "supported_io_types": { 00:15:21.952 "read": true, 
00:15:21.952 "write": true, 00:15:21.952 "unmap": true, 00:15:21.952 "flush": true, 00:15:21.952 "reset": true, 00:15:21.952 "nvme_admin": false, 00:15:21.952 "nvme_io": false, 00:15:21.952 "nvme_io_md": false, 00:15:21.952 "write_zeroes": true, 00:15:21.952 "zcopy": true, 00:15:21.952 "get_zone_info": false, 00:15:21.952 "zone_management": false, 00:15:21.952 "zone_append": false, 00:15:21.952 "compare": false, 00:15:21.952 "compare_and_write": false, 00:15:21.952 "abort": true, 00:15:21.952 "seek_hole": false, 00:15:21.952 "seek_data": false, 00:15:21.952 "copy": true, 00:15:21.952 "nvme_iov_md": false 00:15:21.952 }, 00:15:21.952 "memory_domains": [ 00:15:21.952 { 00:15:21.952 "dma_device_id": "system", 00:15:21.952 "dma_device_type": 1 00:15:21.952 }, 00:15:21.952 { 00:15:21.952 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.952 "dma_device_type": 2 00:15:21.952 } 00:15:21.952 ], 00:15:21.952 "driver_specific": {} 00:15:21.952 }' 00:15:21.952 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:21.952 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:21.952 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:21.952 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:21.952 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:21.952 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:21.952 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:21.952 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:22.211 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:22.211 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.211 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.211 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:22.211 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:22.471 [2024-07-12 13:41:10.877072] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:22.471 [2024-07-12 13:41:10.877100] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:22.471 [2024-07-12 13:41:10.877157] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:22.471 [2024-07-12 13:41:10.877208] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:22.471 [2024-07-12 13:41:10.877220] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d25e90 name Existed_Raid, state offline 00:15:22.471 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 465973 00:15:22.471 13:41:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 465973 ']' 00:15:22.471 13:41:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 465973 00:15:22.471 13:41:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 
00:15:22.471 13:41:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:22.471 13:41:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 465973 00:15:22.471 13:41:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:22.471 13:41:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:22.471 13:41:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 465973' 00:15:22.471 killing process with pid 465973 00:15:22.471 13:41:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 465973 00:15:22.471 [2024-07-12 13:41:10.945980] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:22.471 13:41:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 465973 00:15:22.471 [2024-07-12 13:41:10.977144] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:22.730 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:22.730 00:15:22.730 real 0m28.724s 00:15:22.730 user 0m53.090s 00:15:22.730 sys 0m5.300s 00:15:22.730 13:41:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:22.730 13:41:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.730 ************************************ 00:15:22.730 END TEST raid_state_function_test_sb 00:15:22.730 ************************************ 00:15:22.730 13:41:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:22.730 13:41:11 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:15:22.730 13:41:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:22.730 13:41:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:22.730 13:41:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:22.730 ************************************ 00:15:22.730 START TEST raid_superblock_test 00:15:22.730 ************************************ 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 
00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=470265 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 470265 /var/tmp/spdk-raid.sock 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 470265 ']' 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:22.730 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:22.730 13:41:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:22.990 [2024-07-12 13:41:11.362106] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:15:22.990 [2024-07-12 13:41:11.362173] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid470265 ] 00:15:22.990 [2024-07-12 13:41:11.493344] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:23.247 [2024-07-12 13:41:11.599920] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:23.247 [2024-07-12 13:41:11.663046] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:23.247 [2024-07-12 13:41:11.663079] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:23.814 13:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:23.814 13:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:23.814 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:23.814 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:23.814 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:23.814 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:23.814 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:23.814 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:23.814 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:23.814 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:23.814 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:24.072 malloc1 00:15:24.072 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:24.331 [2024-07-12 13:41:12.773253] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:24.331 [2024-07-12 13:41:12.773299] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:24.331 [2024-07-12 13:41:12.773319] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2199e90 00:15:24.331 [2024-07-12 13:41:12.773332] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:24.331 [2024-07-12 13:41:12.775015] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:24.331 [2024-07-12 13:41:12.775044] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:24.331 pt1 00:15:24.331 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:24.331 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:24.331 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:24.331 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:24.331 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:24.331 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:24.331 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:24.331 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:24.331 13:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:24.598 malloc2 00:15:24.598 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:24.857 [2024-07-12 13:41:13.272529] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:24.857 [2024-07-12 13:41:13.272574] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:24.857 [2024-07-12 13:41:13.272591] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2237fb0 00:15:24.857 [2024-07-12 13:41:13.272609] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:24.857 [2024-07-12 13:41:13.274161] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:24.857 [2024-07-12 13:41:13.274189] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:24.857 pt2 00:15:24.857 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:24.857 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:24.857 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:24.857 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:24.858 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:24.858 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:24.858 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:24.858 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:24.858 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:25.116 malloc3 00:15:25.116 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:25.374 [2024-07-12 13:41:13.754388] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:25.374 [2024-07-12 13:41:13.754437] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:25.374 [2024-07-12 13:41:13.754454] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2238ce0 00:15:25.374 [2024-07-12 13:41:13.754467] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:25.374 [2024-07-12 13:41:13.756050] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:25.374 [2024-07-12 13:41:13.756079] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:25.374 pt3 00:15:25.374 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:25.374 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:25.374 13:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:25.634 [2024-07-12 13:41:13.995041] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:25.634 [2024-07-12 13:41:13.996393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:25.634 [2024-07-12 13:41:13.996447] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:25.634 [2024-07-12 13:41:13.996602] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x223b9a0 00:15:25.634 [2024-07-12 13:41:13.996613] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:25.634 [2024-07-12 13:41:13.996815] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x219ced0 00:15:25.634 [2024-07-12 13:41:13.996964] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x223b9a0 00:15:25.634 [2024-07-12 13:41:13.996975] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x223b9a0 00:15:25.634 [2024-07-12 13:41:13.997071] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.634 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:25.892 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.893 "name": "raid_bdev1", 00:15:25.893 "uuid": "c4f8cdf8-e11b-4f53-a148-89936b80d28c", 00:15:25.893 "strip_size_kb": 64, 00:15:25.893 "state": "online", 00:15:25.893 "raid_level": "concat", 00:15:25.893 "superblock": true, 00:15:25.893 "num_base_bdevs": 3, 
00:15:25.893 "num_base_bdevs_discovered": 3, 00:15:25.893 "num_base_bdevs_operational": 3, 00:15:25.893 "base_bdevs_list": [ 00:15:25.893 { 00:15:25.893 "name": "pt1", 00:15:25.893 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:25.893 "is_configured": true, 00:15:25.893 "data_offset": 2048, 00:15:25.893 "data_size": 63488 00:15:25.893 }, 00:15:25.893 { 00:15:25.893 "name": "pt2", 00:15:25.893 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:25.893 "is_configured": true, 00:15:25.893 "data_offset": 2048, 00:15:25.893 "data_size": 63488 00:15:25.893 }, 00:15:25.893 { 00:15:25.893 "name": "pt3", 00:15:25.893 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:25.893 "is_configured": true, 00:15:25.893 "data_offset": 2048, 00:15:25.893 "data_size": 63488 00:15:25.893 } 00:15:25.893 ] 00:15:25.893 }' 00:15:25.893 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.893 13:41:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:26.459 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:26.459 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:26.459 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:26.459 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:26.459 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:26.459 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:26.459 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:26.459 13:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:26.718 [2024-07-12 13:41:15.074145] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:26.718 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:26.718 "name": "raid_bdev1", 00:15:26.718 "aliases": [ 00:15:26.718 "c4f8cdf8-e11b-4f53-a148-89936b80d28c" 00:15:26.718 ], 00:15:26.718 "product_name": "Raid Volume", 00:15:26.718 "block_size": 512, 00:15:26.718 "num_blocks": 190464, 00:15:26.718 "uuid": "c4f8cdf8-e11b-4f53-a148-89936b80d28c", 00:15:26.718 "assigned_rate_limits": { 00:15:26.718 "rw_ios_per_sec": 0, 00:15:26.718 "rw_mbytes_per_sec": 0, 00:15:26.718 "r_mbytes_per_sec": 0, 00:15:26.718 "w_mbytes_per_sec": 0 00:15:26.718 }, 00:15:26.718 "claimed": false, 00:15:26.718 "zoned": false, 00:15:26.718 "supported_io_types": { 00:15:26.718 "read": true, 00:15:26.718 "write": true, 00:15:26.718 "unmap": true, 00:15:26.718 "flush": true, 00:15:26.718 "reset": true, 00:15:26.718 "nvme_admin": false, 00:15:26.718 "nvme_io": false, 00:15:26.718 "nvme_io_md": false, 00:15:26.718 "write_zeroes": true, 00:15:26.718 "zcopy": false, 00:15:26.718 "get_zone_info": false, 00:15:26.718 "zone_management": false, 00:15:26.718 "zone_append": false, 00:15:26.718 "compare": false, 00:15:26.718 "compare_and_write": false, 00:15:26.718 "abort": false, 00:15:26.718 "seek_hole": false, 00:15:26.718 "seek_data": false, 00:15:26.718 "copy": false, 00:15:26.718 "nvme_iov_md": false 00:15:26.718 }, 00:15:26.718 "memory_domains": [ 00:15:26.718 { 00:15:26.718 "dma_device_id": "system", 00:15:26.718 "dma_device_type": 1 
00:15:26.718 }, 00:15:26.718 { 00:15:26.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.718 "dma_device_type": 2 00:15:26.718 }, 00:15:26.718 { 00:15:26.718 "dma_device_id": "system", 00:15:26.718 "dma_device_type": 1 00:15:26.718 }, 00:15:26.718 { 00:15:26.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.718 "dma_device_type": 2 00:15:26.718 }, 00:15:26.718 { 00:15:26.718 "dma_device_id": "system", 00:15:26.718 "dma_device_type": 1 00:15:26.718 }, 00:15:26.718 { 00:15:26.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.718 "dma_device_type": 2 00:15:26.718 } 00:15:26.718 ], 00:15:26.718 "driver_specific": { 00:15:26.718 "raid": { 00:15:26.718 "uuid": "c4f8cdf8-e11b-4f53-a148-89936b80d28c", 00:15:26.718 "strip_size_kb": 64, 00:15:26.718 "state": "online", 00:15:26.718 "raid_level": "concat", 00:15:26.718 "superblock": true, 00:15:26.718 "num_base_bdevs": 3, 00:15:26.718 "num_base_bdevs_discovered": 3, 00:15:26.718 "num_base_bdevs_operational": 3, 00:15:26.718 "base_bdevs_list": [ 00:15:26.718 { 00:15:26.718 "name": "pt1", 00:15:26.718 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:26.718 "is_configured": true, 00:15:26.718 "data_offset": 2048, 00:15:26.718 "data_size": 63488 00:15:26.718 }, 00:15:26.718 { 00:15:26.718 "name": "pt2", 00:15:26.718 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:26.718 "is_configured": true, 00:15:26.718 "data_offset": 2048, 00:15:26.718 "data_size": 63488 00:15:26.718 }, 00:15:26.718 { 00:15:26.718 "name": "pt3", 00:15:26.718 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:26.718 "is_configured": true, 00:15:26.718 "data_offset": 2048, 00:15:26.718 "data_size": 63488 00:15:26.718 } 00:15:26.718 ] 00:15:26.718 } 00:15:26.718 } 00:15:26.718 }' 00:15:26.718 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:26.718 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:26.718 pt2 00:15:26.718 pt3' 00:15:26.718 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:26.718 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.719 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:26.977 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.977 "name": "pt1", 00:15:26.977 "aliases": [ 00:15:26.977 "00000000-0000-0000-0000-000000000001" 00:15:26.977 ], 00:15:26.977 "product_name": "passthru", 00:15:26.977 "block_size": 512, 00:15:26.977 "num_blocks": 65536, 00:15:26.977 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:26.977 "assigned_rate_limits": { 00:15:26.977 "rw_ios_per_sec": 0, 00:15:26.977 "rw_mbytes_per_sec": 0, 00:15:26.977 "r_mbytes_per_sec": 0, 00:15:26.977 "w_mbytes_per_sec": 0 00:15:26.977 }, 00:15:26.977 "claimed": true, 00:15:26.977 "claim_type": "exclusive_write", 00:15:26.977 "zoned": false, 00:15:26.977 "supported_io_types": { 00:15:26.977 "read": true, 00:15:26.977 "write": true, 00:15:26.977 "unmap": true, 00:15:26.977 "flush": true, 00:15:26.977 "reset": true, 00:15:26.977 "nvme_admin": false, 00:15:26.977 "nvme_io": false, 00:15:26.977 "nvme_io_md": false, 00:15:26.977 "write_zeroes": true, 00:15:26.977 "zcopy": true, 00:15:26.977 "get_zone_info": false, 00:15:26.977 "zone_management": false, 
00:15:26.977 "zone_append": false, 00:15:26.977 "compare": false, 00:15:26.977 "compare_and_write": false, 00:15:26.977 "abort": true, 00:15:26.977 "seek_hole": false, 00:15:26.977 "seek_data": false, 00:15:26.977 "copy": true, 00:15:26.977 "nvme_iov_md": false 00:15:26.977 }, 00:15:26.977 "memory_domains": [ 00:15:26.977 { 00:15:26.977 "dma_device_id": "system", 00:15:26.977 "dma_device_type": 1 00:15:26.977 }, 00:15:26.977 { 00:15:26.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.977 "dma_device_type": 2 00:15:26.977 } 00:15:26.977 ], 00:15:26.977 "driver_specific": { 00:15:26.977 "passthru": { 00:15:26.977 "name": "pt1", 00:15:26.977 "base_bdev_name": "malloc1" 00:15:26.977 } 00:15:26.977 } 00:15:26.977 }' 00:15:26.977 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.977 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.977 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.977 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.977 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.235 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:27.235 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.235 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.235 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:27.235 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.235 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.235 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:27.235 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:27.235 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:27.235 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:27.494 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:27.494 "name": "pt2", 00:15:27.494 "aliases": [ 00:15:27.494 "00000000-0000-0000-0000-000000000002" 00:15:27.494 ], 00:15:27.494 "product_name": "passthru", 00:15:27.494 "block_size": 512, 00:15:27.494 "num_blocks": 65536, 00:15:27.494 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:27.494 "assigned_rate_limits": { 00:15:27.494 "rw_ios_per_sec": 0, 00:15:27.494 "rw_mbytes_per_sec": 0, 00:15:27.494 "r_mbytes_per_sec": 0, 00:15:27.494 "w_mbytes_per_sec": 0 00:15:27.494 }, 00:15:27.494 "claimed": true, 00:15:27.494 "claim_type": "exclusive_write", 00:15:27.494 "zoned": false, 00:15:27.494 "supported_io_types": { 00:15:27.494 "read": true, 00:15:27.494 "write": true, 00:15:27.494 "unmap": true, 00:15:27.494 "flush": true, 00:15:27.494 "reset": true, 00:15:27.494 "nvme_admin": false, 00:15:27.494 "nvme_io": false, 00:15:27.494 "nvme_io_md": false, 00:15:27.494 "write_zeroes": true, 00:15:27.494 "zcopy": true, 00:15:27.494 "get_zone_info": false, 00:15:27.494 "zone_management": false, 00:15:27.494 "zone_append": false, 00:15:27.494 "compare": false, 00:15:27.494 "compare_and_write": false, 00:15:27.494 "abort": true, 
00:15:27.494 "seek_hole": false, 00:15:27.494 "seek_data": false, 00:15:27.494 "copy": true, 00:15:27.494 "nvme_iov_md": false 00:15:27.494 }, 00:15:27.494 "memory_domains": [ 00:15:27.494 { 00:15:27.494 "dma_device_id": "system", 00:15:27.494 "dma_device_type": 1 00:15:27.494 }, 00:15:27.494 { 00:15:27.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.494 "dma_device_type": 2 00:15:27.494 } 00:15:27.494 ], 00:15:27.494 "driver_specific": { 00:15:27.494 "passthru": { 00:15:27.494 "name": "pt2", 00:15:27.494 "base_bdev_name": "malloc2" 00:15:27.494 } 00:15:27.494 } 00:15:27.494 }' 00:15:27.494 13:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.494 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.494 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:27.494 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.753 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.753 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:27.753 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.753 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.753 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:27.753 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.753 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.753 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:27.753 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:27.753 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:28.011 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:28.011 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:28.011 "name": "pt3", 00:15:28.011 "aliases": [ 00:15:28.011 "00000000-0000-0000-0000-000000000003" 00:15:28.011 ], 00:15:28.011 "product_name": "passthru", 00:15:28.011 "block_size": 512, 00:15:28.011 "num_blocks": 65536, 00:15:28.011 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:28.011 "assigned_rate_limits": { 00:15:28.011 "rw_ios_per_sec": 0, 00:15:28.011 "rw_mbytes_per_sec": 0, 00:15:28.011 "r_mbytes_per_sec": 0, 00:15:28.011 "w_mbytes_per_sec": 0 00:15:28.011 }, 00:15:28.011 "claimed": true, 00:15:28.011 "claim_type": "exclusive_write", 00:15:28.011 "zoned": false, 00:15:28.011 "supported_io_types": { 00:15:28.011 "read": true, 00:15:28.011 "write": true, 00:15:28.011 "unmap": true, 00:15:28.011 "flush": true, 00:15:28.011 "reset": true, 00:15:28.011 "nvme_admin": false, 00:15:28.011 "nvme_io": false, 00:15:28.011 "nvme_io_md": false, 00:15:28.011 "write_zeroes": true, 00:15:28.011 "zcopy": true, 00:15:28.011 "get_zone_info": false, 00:15:28.011 "zone_management": false, 00:15:28.011 "zone_append": false, 00:15:28.011 "compare": false, 00:15:28.011 "compare_and_write": false, 00:15:28.011 "abort": true, 00:15:28.011 "seek_hole": false, 00:15:28.011 "seek_data": false, 00:15:28.011 "copy": true, 00:15:28.011 "nvme_iov_md": false 
00:15:28.011 }, 00:15:28.011 "memory_domains": [ 00:15:28.011 { 00:15:28.011 "dma_device_id": "system", 00:15:28.011 "dma_device_type": 1 00:15:28.011 }, 00:15:28.011 { 00:15:28.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.011 "dma_device_type": 2 00:15:28.011 } 00:15:28.011 ], 00:15:28.011 "driver_specific": { 00:15:28.011 "passthru": { 00:15:28.011 "name": "pt3", 00:15:28.011 "base_bdev_name": "malloc3" 00:15:28.011 } 00:15:28.011 } 00:15:28.011 }' 00:15:28.011 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.269 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.269 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:28.269 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.269 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.269 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:28.270 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.270 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.270 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:28.270 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.528 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.528 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:28.528 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:28.528 13:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:28.786 [2024-07-12 13:41:17.147635] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:28.786 13:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c4f8cdf8-e11b-4f53-a148-89936b80d28c 00:15:28.786 13:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c4f8cdf8-e11b-4f53-a148-89936b80d28c ']' 00:15:28.786 13:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:29.045 [2024-07-12 13:41:17.396013] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:29.045 [2024-07-12 13:41:17.396036] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:29.045 [2024-07-12 13:41:17.396085] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:29.045 [2024-07-12 13:41:17.396138] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:29.045 [2024-07-12 13:41:17.396149] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x223b9a0 name raid_bdev1, state offline 00:15:29.045 13:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.045 13:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:29.304 13:41:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:29.304 13:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:29.304 13:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:29.304 13:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:29.562 13:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:29.562 13:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:29.562 13:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:29.562 13:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:29.820 13:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:29.820 13:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:30.079 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 
'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:30.337 [2024-07-12 13:41:18.855813] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:30.338 [2024-07-12 13:41:18.857220] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:30.338 [2024-07-12 13:41:18.857262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:30.338 [2024-07-12 13:41:18.857308] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:30.338 [2024-07-12 13:41:18.857348] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:30.338 [2024-07-12 13:41:18.857371] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:30.338 [2024-07-12 13:41:18.857388] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:30.338 [2024-07-12 13:41:18.857398] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x219d040 name raid_bdev1, state configuring 00:15:30.338 request: 00:15:30.338 { 00:15:30.338 "name": "raid_bdev1", 00:15:30.338 "raid_level": "concat", 00:15:30.338 "base_bdevs": [ 00:15:30.338 "malloc1", 00:15:30.338 "malloc2", 00:15:30.338 "malloc3" 00:15:30.338 ], 00:15:30.338 "strip_size_kb": 64, 00:15:30.338 "superblock": false, 00:15:30.338 "method": "bdev_raid_create", 00:15:30.338 "req_id": 1 00:15:30.338 } 00:15:30.338 Got JSON-RPC error response 00:15:30.338 response: 00:15:30.338 { 00:15:30.338 "code": -17, 00:15:30.338 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:30.338 } 00:15:30.338 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:30.338 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:30.338 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:30.338 13:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:30.338 13:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.338 13:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:30.595 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:30.595 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:30.595 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:30.853 [2024-07-12 13:41:19.349049] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:30.853 [2024-07-12 13:41:19.349090] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:30.854 [2024-07-12 13:41:19.349108] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2239de0 00:15:30.854 [2024-07-12 13:41:19.349121] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:30.854 [2024-07-12 13:41:19.350696] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:30.854 [2024-07-12 13:41:19.350727] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:30.854 [2024-07-12 13:41:19.350790] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:30.854 [2024-07-12 13:41:19.350814] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:30.854 pt1 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.854 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:31.112 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.112 "name": "raid_bdev1", 00:15:31.112 "uuid": "c4f8cdf8-e11b-4f53-a148-89936b80d28c", 00:15:31.112 "strip_size_kb": 64, 00:15:31.112 "state": "configuring", 00:15:31.112 "raid_level": "concat", 00:15:31.112 "superblock": true, 00:15:31.112 "num_base_bdevs": 3, 00:15:31.112 "num_base_bdevs_discovered": 1, 00:15:31.112 "num_base_bdevs_operational": 3, 00:15:31.112 "base_bdevs_list": [ 00:15:31.112 { 00:15:31.112 "name": "pt1", 00:15:31.112 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:31.112 "is_configured": true, 00:15:31.112 "data_offset": 2048, 00:15:31.112 "data_size": 63488 00:15:31.112 }, 00:15:31.112 { 00:15:31.112 "name": null, 00:15:31.112 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:31.112 "is_configured": false, 00:15:31.112 "data_offset": 2048, 00:15:31.112 "data_size": 63488 00:15:31.112 }, 00:15:31.112 { 00:15:31.112 "name": null, 00:15:31.112 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:31.112 "is_configured": false, 00:15:31.112 "data_offset": 2048, 00:15:31.112 "data_size": 63488 00:15:31.112 } 00:15:31.112 ] 00:15:31.112 }' 00:15:31.112 13:41:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.112 13:41:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:31.679 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:31.679 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:31.937 [2024-07-12 
13:41:20.440051] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:31.937 [2024-07-12 13:41:20.440103] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:31.937 [2024-07-12 13:41:20.440124] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x219a0c0 00:15:31.937 [2024-07-12 13:41:20.440136] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:31.937 [2024-07-12 13:41:20.440474] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:31.937 [2024-07-12 13:41:20.440491] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:31.937 [2024-07-12 13:41:20.440552] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:31.937 [2024-07-12 13:41:20.440569] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:31.937 pt2 00:15:31.937 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:32.196 [2024-07-12 13:41:20.688715] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.196 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:32.455 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.455 "name": "raid_bdev1", 00:15:32.455 "uuid": "c4f8cdf8-e11b-4f53-a148-89936b80d28c", 00:15:32.455 "strip_size_kb": 64, 00:15:32.455 "state": "configuring", 00:15:32.455 "raid_level": "concat", 00:15:32.455 "superblock": true, 00:15:32.455 "num_base_bdevs": 3, 00:15:32.455 "num_base_bdevs_discovered": 1, 00:15:32.455 "num_base_bdevs_operational": 3, 00:15:32.455 "base_bdevs_list": [ 00:15:32.455 { 00:15:32.455 "name": "pt1", 00:15:32.455 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:32.455 "is_configured": true, 00:15:32.455 "data_offset": 2048, 00:15:32.455 "data_size": 63488 00:15:32.455 }, 00:15:32.455 { 00:15:32.455 "name": null, 00:15:32.455 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:32.455 "is_configured": false, 
00:15:32.455 "data_offset": 2048, 00:15:32.455 "data_size": 63488 00:15:32.455 }, 00:15:32.455 { 00:15:32.455 "name": null, 00:15:32.455 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:32.455 "is_configured": false, 00:15:32.455 "data_offset": 2048, 00:15:32.455 "data_size": 63488 00:15:32.455 } 00:15:32.455 ] 00:15:32.455 }' 00:15:32.455 13:41:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.455 13:41:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.022 13:41:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:33.022 13:41:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:33.022 13:41:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:33.280 [2024-07-12 13:41:21.775579] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:33.280 [2024-07-12 13:41:21.775626] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:33.280 [2024-07-12 13:41:21.775650] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2198b10 00:15:33.280 [2024-07-12 13:41:21.775662] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:33.280 [2024-07-12 13:41:21.776004] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:33.280 [2024-07-12 13:41:21.776021] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:33.280 [2024-07-12 13:41:21.776081] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:33.280 [2024-07-12 13:41:21.776098] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:33.280 pt2 00:15:33.280 13:41:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:33.280 13:41:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:33.280 13:41:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:33.537 [2024-07-12 13:41:22.024233] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:33.537 [2024-07-12 13:41:22.024263] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:33.537 [2024-07-12 13:41:22.024280] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22396f0 00:15:33.537 [2024-07-12 13:41:22.024291] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:33.537 [2024-07-12 13:41:22.024565] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:33.537 [2024-07-12 13:41:22.024582] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:33.537 [2024-07-12 13:41:22.024630] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:33.537 [2024-07-12 13:41:22.024647] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:33.537 [2024-07-12 13:41:22.024747] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x223c8c0 00:15:33.537 [2024-07-12 13:41:22.024758] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:33.537 [2024-07-12 13:41:22.024919] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2239980 00:15:33.537 [2024-07-12 13:41:22.025048] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x223c8c0 00:15:33.537 [2024-07-12 13:41:22.025058] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x223c8c0 00:15:33.537 [2024-07-12 13:41:22.025151] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:33.537 pt3 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.537 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:33.794 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.794 "name": "raid_bdev1", 00:15:33.794 "uuid": "c4f8cdf8-e11b-4f53-a148-89936b80d28c", 00:15:33.794 "strip_size_kb": 64, 00:15:33.794 "state": "online", 00:15:33.794 "raid_level": "concat", 00:15:33.794 "superblock": true, 00:15:33.794 "num_base_bdevs": 3, 00:15:33.794 "num_base_bdevs_discovered": 3, 00:15:33.794 "num_base_bdevs_operational": 3, 00:15:33.794 "base_bdevs_list": [ 00:15:33.794 { 00:15:33.794 "name": "pt1", 00:15:33.794 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:33.794 "is_configured": true, 00:15:33.794 "data_offset": 2048, 00:15:33.794 "data_size": 63488 00:15:33.794 }, 00:15:33.794 { 00:15:33.794 "name": "pt2", 00:15:33.794 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:33.794 "is_configured": true, 00:15:33.794 "data_offset": 2048, 00:15:33.794 "data_size": 63488 00:15:33.794 }, 00:15:33.794 { 00:15:33.794 "name": "pt3", 00:15:33.794 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:33.794 "is_configured": true, 00:15:33.794 "data_offset": 2048, 00:15:33.794 "data_size": 63488 00:15:33.794 } 00:15:33.794 ] 00:15:33.794 }' 00:15:33.794 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.794 13:41:22 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:34.359 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:34.359 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:34.359 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:34.359 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:34.359 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:34.359 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:34.359 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:34.359 13:41:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:34.617 [2024-07-12 13:41:23.055262] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:34.617 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:34.617 "name": "raid_bdev1", 00:15:34.617 "aliases": [ 00:15:34.617 "c4f8cdf8-e11b-4f53-a148-89936b80d28c" 00:15:34.617 ], 00:15:34.617 "product_name": "Raid Volume", 00:15:34.617 "block_size": 512, 00:15:34.617 "num_blocks": 190464, 00:15:34.617 "uuid": "c4f8cdf8-e11b-4f53-a148-89936b80d28c", 00:15:34.617 "assigned_rate_limits": { 00:15:34.617 "rw_ios_per_sec": 0, 00:15:34.617 "rw_mbytes_per_sec": 0, 00:15:34.617 "r_mbytes_per_sec": 0, 00:15:34.617 "w_mbytes_per_sec": 0 00:15:34.617 }, 00:15:34.617 "claimed": false, 00:15:34.617 "zoned": false, 00:15:34.617 "supported_io_types": { 00:15:34.617 "read": true, 00:15:34.617 "write": true, 00:15:34.617 "unmap": true, 00:15:34.617 "flush": true, 00:15:34.617 "reset": true, 00:15:34.617 "nvme_admin": false, 00:15:34.617 "nvme_io": false, 00:15:34.617 "nvme_io_md": false, 00:15:34.617 "write_zeroes": true, 00:15:34.617 "zcopy": false, 00:15:34.617 "get_zone_info": false, 00:15:34.617 "zone_management": false, 00:15:34.617 "zone_append": false, 00:15:34.617 "compare": false, 00:15:34.617 "compare_and_write": false, 00:15:34.617 "abort": false, 00:15:34.617 "seek_hole": false, 00:15:34.617 "seek_data": false, 00:15:34.617 "copy": false, 00:15:34.617 "nvme_iov_md": false 00:15:34.617 }, 00:15:34.617 "memory_domains": [ 00:15:34.617 { 00:15:34.617 "dma_device_id": "system", 00:15:34.617 "dma_device_type": 1 00:15:34.617 }, 00:15:34.617 { 00:15:34.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.617 "dma_device_type": 2 00:15:34.617 }, 00:15:34.617 { 00:15:34.617 "dma_device_id": "system", 00:15:34.617 "dma_device_type": 1 00:15:34.617 }, 00:15:34.617 { 00:15:34.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.617 "dma_device_type": 2 00:15:34.617 }, 00:15:34.617 { 00:15:34.617 "dma_device_id": "system", 00:15:34.617 "dma_device_type": 1 00:15:34.617 }, 00:15:34.617 { 00:15:34.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.617 "dma_device_type": 2 00:15:34.617 } 00:15:34.617 ], 00:15:34.617 "driver_specific": { 00:15:34.617 "raid": { 00:15:34.618 "uuid": "c4f8cdf8-e11b-4f53-a148-89936b80d28c", 00:15:34.618 "strip_size_kb": 64, 00:15:34.618 "state": "online", 00:15:34.618 "raid_level": "concat", 00:15:34.618 "superblock": true, 00:15:34.618 "num_base_bdevs": 3, 00:15:34.618 "num_base_bdevs_discovered": 3, 
00:15:34.618 "num_base_bdevs_operational": 3, 00:15:34.618 "base_bdevs_list": [ 00:15:34.618 { 00:15:34.618 "name": "pt1", 00:15:34.618 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:34.618 "is_configured": true, 00:15:34.618 "data_offset": 2048, 00:15:34.618 "data_size": 63488 00:15:34.618 }, 00:15:34.618 { 00:15:34.618 "name": "pt2", 00:15:34.618 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:34.618 "is_configured": true, 00:15:34.618 "data_offset": 2048, 00:15:34.618 "data_size": 63488 00:15:34.618 }, 00:15:34.618 { 00:15:34.618 "name": "pt3", 00:15:34.618 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:34.618 "is_configured": true, 00:15:34.618 "data_offset": 2048, 00:15:34.618 "data_size": 63488 00:15:34.618 } 00:15:34.618 ] 00:15:34.618 } 00:15:34.618 } 00:15:34.618 }' 00:15:34.618 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:34.618 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:34.618 pt2 00:15:34.618 pt3' 00:15:34.618 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:34.618 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:34.618 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:34.876 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:34.876 "name": "pt1", 00:15:34.876 "aliases": [ 00:15:34.876 "00000000-0000-0000-0000-000000000001" 00:15:34.876 ], 00:15:34.876 "product_name": "passthru", 00:15:34.876 "block_size": 512, 00:15:34.876 "num_blocks": 65536, 00:15:34.876 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:34.876 "assigned_rate_limits": { 00:15:34.876 "rw_ios_per_sec": 0, 00:15:34.876 "rw_mbytes_per_sec": 0, 00:15:34.876 "r_mbytes_per_sec": 0, 00:15:34.876 "w_mbytes_per_sec": 0 00:15:34.876 }, 00:15:34.876 "claimed": true, 00:15:34.876 "claim_type": "exclusive_write", 00:15:34.876 "zoned": false, 00:15:34.876 "supported_io_types": { 00:15:34.876 "read": true, 00:15:34.876 "write": true, 00:15:34.876 "unmap": true, 00:15:34.876 "flush": true, 00:15:34.876 "reset": true, 00:15:34.876 "nvme_admin": false, 00:15:34.876 "nvme_io": false, 00:15:34.876 "nvme_io_md": false, 00:15:34.876 "write_zeroes": true, 00:15:34.876 "zcopy": true, 00:15:34.876 "get_zone_info": false, 00:15:34.876 "zone_management": false, 00:15:34.876 "zone_append": false, 00:15:34.876 "compare": false, 00:15:34.876 "compare_and_write": false, 00:15:34.876 "abort": true, 00:15:34.876 "seek_hole": false, 00:15:34.876 "seek_data": false, 00:15:34.876 "copy": true, 00:15:34.876 "nvme_iov_md": false 00:15:34.876 }, 00:15:34.876 "memory_domains": [ 00:15:34.876 { 00:15:34.876 "dma_device_id": "system", 00:15:34.876 "dma_device_type": 1 00:15:34.876 }, 00:15:34.876 { 00:15:34.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.876 "dma_device_type": 2 00:15:34.876 } 00:15:34.876 ], 00:15:34.876 "driver_specific": { 00:15:34.876 "passthru": { 00:15:34.876 "name": "pt1", 00:15:34.876 "base_bdev_name": "malloc1" 00:15:34.876 } 00:15:34.876 } 00:15:34.876 }' 00:15:34.876 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.876 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:15:35.133 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:35.133 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.133 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.133 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:35.133 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:35.133 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:35.133 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:35.133 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.133 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.391 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:35.391 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:35.391 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:35.391 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:35.650 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:35.650 "name": "pt2", 00:15:35.650 "aliases": [ 00:15:35.650 "00000000-0000-0000-0000-000000000002" 00:15:35.650 ], 00:15:35.650 "product_name": "passthru", 00:15:35.650 "block_size": 512, 00:15:35.650 "num_blocks": 65536, 00:15:35.650 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:35.650 "assigned_rate_limits": { 00:15:35.650 "rw_ios_per_sec": 0, 00:15:35.650 "rw_mbytes_per_sec": 0, 00:15:35.650 "r_mbytes_per_sec": 0, 00:15:35.650 "w_mbytes_per_sec": 0 00:15:35.650 }, 00:15:35.650 "claimed": true, 00:15:35.650 "claim_type": "exclusive_write", 00:15:35.650 "zoned": false, 00:15:35.650 "supported_io_types": { 00:15:35.650 "read": true, 00:15:35.650 "write": true, 00:15:35.650 "unmap": true, 00:15:35.650 "flush": true, 00:15:35.650 "reset": true, 00:15:35.650 "nvme_admin": false, 00:15:35.650 "nvme_io": false, 00:15:35.650 "nvme_io_md": false, 00:15:35.650 "write_zeroes": true, 00:15:35.650 "zcopy": true, 00:15:35.650 "get_zone_info": false, 00:15:35.650 "zone_management": false, 00:15:35.650 "zone_append": false, 00:15:35.650 "compare": false, 00:15:35.650 "compare_and_write": false, 00:15:35.650 "abort": true, 00:15:35.650 "seek_hole": false, 00:15:35.650 "seek_data": false, 00:15:35.650 "copy": true, 00:15:35.650 "nvme_iov_md": false 00:15:35.650 }, 00:15:35.650 "memory_domains": [ 00:15:35.650 { 00:15:35.650 "dma_device_id": "system", 00:15:35.650 "dma_device_type": 1 00:15:35.650 }, 00:15:35.650 { 00:15:35.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:35.650 "dma_device_type": 2 00:15:35.650 } 00:15:35.650 ], 00:15:35.650 "driver_specific": { 00:15:35.650 "passthru": { 00:15:35.650 "name": "pt2", 00:15:35.650 "base_bdev_name": "malloc2" 00:15:35.650 } 00:15:35.650 } 00:15:35.650 }' 00:15:35.650 13:41:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:35.650 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:35.650 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:35.650 13:41:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.650 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.650 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:35.650 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:35.650 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:35.908 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:35.908 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.908 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:35.908 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:35.908 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:35.908 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:35.908 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:36.167 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:36.167 "name": "pt3", 00:15:36.167 "aliases": [ 00:15:36.167 "00000000-0000-0000-0000-000000000003" 00:15:36.167 ], 00:15:36.167 "product_name": "passthru", 00:15:36.167 "block_size": 512, 00:15:36.167 "num_blocks": 65536, 00:15:36.167 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:36.167 "assigned_rate_limits": { 00:15:36.167 "rw_ios_per_sec": 0, 00:15:36.167 "rw_mbytes_per_sec": 0, 00:15:36.167 "r_mbytes_per_sec": 0, 00:15:36.167 "w_mbytes_per_sec": 0 00:15:36.167 }, 00:15:36.167 "claimed": true, 00:15:36.167 "claim_type": "exclusive_write", 00:15:36.167 "zoned": false, 00:15:36.167 "supported_io_types": { 00:15:36.167 "read": true, 00:15:36.167 "write": true, 00:15:36.167 "unmap": true, 00:15:36.167 "flush": true, 00:15:36.167 "reset": true, 00:15:36.167 "nvme_admin": false, 00:15:36.167 "nvme_io": false, 00:15:36.167 "nvme_io_md": false, 00:15:36.167 "write_zeroes": true, 00:15:36.167 "zcopy": true, 00:15:36.167 "get_zone_info": false, 00:15:36.167 "zone_management": false, 00:15:36.167 "zone_append": false, 00:15:36.167 "compare": false, 00:15:36.167 "compare_and_write": false, 00:15:36.167 "abort": true, 00:15:36.167 "seek_hole": false, 00:15:36.167 "seek_data": false, 00:15:36.167 "copy": true, 00:15:36.167 "nvme_iov_md": false 00:15:36.167 }, 00:15:36.167 "memory_domains": [ 00:15:36.167 { 00:15:36.167 "dma_device_id": "system", 00:15:36.167 "dma_device_type": 1 00:15:36.167 }, 00:15:36.167 { 00:15:36.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.167 "dma_device_type": 2 00:15:36.167 } 00:15:36.167 ], 00:15:36.167 "driver_specific": { 00:15:36.167 "passthru": { 00:15:36.167 "name": "pt3", 00:15:36.167 "base_bdev_name": "malloc3" 00:15:36.167 } 00:15:36.167 } 00:15:36.167 }' 00:15:36.167 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:36.167 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:36.167 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:36.167 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:36.167 13:41:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:36.426 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:36.426 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:36.426 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:36.426 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:36.426 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:36.426 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:36.426 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:36.426 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:36.426 13:41:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:36.684 [2024-07-12 13:41:25.164975] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c4f8cdf8-e11b-4f53-a148-89936b80d28c '!=' c4f8cdf8-e11b-4f53-a148-89936b80d28c ']' 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 470265 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 470265 ']' 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 470265 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 470265 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 470265' 00:15:36.684 killing process with pid 470265 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 470265 00:15:36.684 [2024-07-12 13:41:25.237979] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:36.684 [2024-07-12 13:41:25.238035] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:36.684 [2024-07-12 13:41:25.238086] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:36.684 [2024-07-12 13:41:25.238098] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x223c8c0 name raid_bdev1, state offline 00:15:36.684 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 470265 00:15:36.943 [2024-07-12 13:41:25.269413] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:36.943 13:41:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:36.943 00:15:36.943 real 0m14.186s 00:15:36.943 user 0m25.428s 00:15:36.943 sys 0m2.656s 00:15:36.943 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:36.943 13:41:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.943 ************************************ 00:15:36.943 END TEST raid_superblock_test 00:15:36.943 ************************************ 00:15:37.202 13:41:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:37.202 13:41:25 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:15:37.202 13:41:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:37.202 13:41:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:37.202 13:41:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:37.202 ************************************ 00:15:37.202 START TEST raid_read_error_test 00:15:37.202 ************************************ 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:37.202 13:41:25 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.tMXV5LMRQA 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=472484 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 472484 /var/tmp/spdk-raid.sock 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 472484 ']' 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:37.202 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:37.202 13:41:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.202 [2024-07-12 13:41:25.642741] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
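The bdevperf run that starts here sits on top of a fixture that is assembled entirely through rpc.py calls against the application's RPC socket, as the trace below shows step by step. A minimal sketch of that same setup, assuming an SPDK application is already listening on /var/tmp/spdk-raid.sock and that rootdir points at the same checkout as the log (both names mirror the trace; the loop is only illustrative):

#!/usr/bin/env bash
# Sketch only: rebuild the three-way concat raid used by raid_read_error_test.
rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk   # assumption: same tree as in the log
rpc="$rootdir/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

for i in 1 2 3; do
    # 32 MiB malloc bdev with 512-byte blocks, as in the trace
    $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
    # error bdev wraps the malloc bdev and exposes EE_BaseBdev${i}_malloc
    $rpc bdev_error_create BaseBdev${i}_malloc
    # passthru bdev gives the raid a stable name on top of the error bdev
    $rpc bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
done

# concat raid with a 64k strip size and an on-disk superblock (-s)
$rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s

# inject read failures into the first base bdev, then kick off the I/O
$rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure
$rootdir/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests

Because concat carries no redundancy, the injected read errors are expected to surface as a non-zero failure rate in the bdevperf log, which is exactly what the fail_per_s check at the end of this test asserts.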
00:15:37.202 [2024-07-12 13:41:25.642805] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid472484 ] 00:15:37.202 [2024-07-12 13:41:25.770656] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:37.461 [2024-07-12 13:41:25.874636] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:37.461 [2024-07-12 13:41:25.942640] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:37.461 [2024-07-12 13:41:25.942671] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:38.033 13:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:38.033 13:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:38.033 13:41:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:38.033 13:41:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:38.601 BaseBdev1_malloc 00:15:38.601 13:41:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:38.860 true 00:15:38.860 13:41:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:39.156 [2024-07-12 13:41:27.558251] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:39.156 [2024-07-12 13:41:27.558297] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:39.156 [2024-07-12 13:41:27.558318] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe4a10 00:15:39.156 [2024-07-12 13:41:27.558331] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:39.156 [2024-07-12 13:41:27.560237] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:39.156 [2024-07-12 13:41:27.560267] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:39.156 BaseBdev1 00:15:39.156 13:41:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:39.156 13:41:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:39.795 BaseBdev2_malloc 00:15:39.795 13:41:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:40.120 true 00:15:40.120 13:41:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:40.406 [2024-07-12 13:41:28.839466] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:40.406 [2024-07-12 13:41:28.839511] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:40.406 [2024-07-12 13:41:28.839532] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe9250 00:15:40.406 [2024-07-12 13:41:28.839545] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:40.406 [2024-07-12 13:41:28.841041] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:40.406 [2024-07-12 13:41:28.841067] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:40.406 BaseBdev2 00:15:40.406 13:41:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:40.406 13:41:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:40.665 BaseBdev3_malloc 00:15:40.665 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:40.925 true 00:15:40.925 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:41.184 [2024-07-12 13:41:29.614045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:41.184 [2024-07-12 13:41:29.614096] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:41.185 [2024-07-12 13:41:29.614117] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfeb510 00:15:41.185 [2024-07-12 13:41:29.614130] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:41.185 [2024-07-12 13:41:29.615736] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:41.185 [2024-07-12 13:41:29.615763] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:41.185 BaseBdev3 00:15:41.185 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:41.444 [2024-07-12 13:41:29.866737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:41.444 [2024-07-12 13:41:29.868001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:41.444 [2024-07-12 13:41:29.868069] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:41.444 [2024-07-12 13:41:29.868271] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfecbc0 00:15:41.444 [2024-07-12 13:41:29.868283] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:41.444 [2024-07-12 13:41:29.868472] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfec760 00:15:41.444 [2024-07-12 13:41:29.868615] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfecbc0 00:15:41.444 [2024-07-12 13:41:29.868625] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfecbc0 00:15:41.444 [2024-07-12 13:41:29.868725] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:41.444 
13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:41.444 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:41.444 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:41.444 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:41.444 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:41.444 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:41.444 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:41.444 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:41.444 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:41.445 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:41.445 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.445 13:41:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:41.703 13:41:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:41.703 "name": "raid_bdev1", 00:15:41.703 "uuid": "92d4724a-c740-4605-a1f0-1952f0b5c91b", 00:15:41.703 "strip_size_kb": 64, 00:15:41.703 "state": "online", 00:15:41.703 "raid_level": "concat", 00:15:41.703 "superblock": true, 00:15:41.703 "num_base_bdevs": 3, 00:15:41.703 "num_base_bdevs_discovered": 3, 00:15:41.703 "num_base_bdevs_operational": 3, 00:15:41.703 "base_bdevs_list": [ 00:15:41.703 { 00:15:41.703 "name": "BaseBdev1", 00:15:41.703 "uuid": "1de10ac0-53e1-5231-a840-d5db2413718d", 00:15:41.703 "is_configured": true, 00:15:41.703 "data_offset": 2048, 00:15:41.703 "data_size": 63488 00:15:41.703 }, 00:15:41.703 { 00:15:41.703 "name": "BaseBdev2", 00:15:41.703 "uuid": "3a42c39e-8c8e-56a4-a976-c47cb61bf0d2", 00:15:41.703 "is_configured": true, 00:15:41.703 "data_offset": 2048, 00:15:41.703 "data_size": 63488 00:15:41.703 }, 00:15:41.703 { 00:15:41.703 "name": "BaseBdev3", 00:15:41.703 "uuid": "73d3c329-d309-569f-bd4a-4c18661b6900", 00:15:41.703 "is_configured": true, 00:15:41.703 "data_offset": 2048, 00:15:41.703 "data_size": 63488 00:15:41.703 } 00:15:41.703 ] 00:15:41.703 }' 00:15:41.703 13:41:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:41.703 13:41:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.271 13:41:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:42.271 13:41:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:42.531 [2024-07-12 13:41:30.857636] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe3ae10 00:15:43.467 13:41:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:43.467 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:15:43.467 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:43.467 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:43.467 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:43.467 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:43.467 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:43.467 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:43.468 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:43.468 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:43.468 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:43.468 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:43.468 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:43.468 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:43.468 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:43.468 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:43.727 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:43.727 "name": "raid_bdev1", 00:15:43.727 "uuid": "92d4724a-c740-4605-a1f0-1952f0b5c91b", 00:15:43.727 "strip_size_kb": 64, 00:15:43.727 "state": "online", 00:15:43.727 "raid_level": "concat", 00:15:43.727 "superblock": true, 00:15:43.727 "num_base_bdevs": 3, 00:15:43.727 "num_base_bdevs_discovered": 3, 00:15:43.727 "num_base_bdevs_operational": 3, 00:15:43.727 "base_bdevs_list": [ 00:15:43.727 { 00:15:43.727 "name": "BaseBdev1", 00:15:43.727 "uuid": "1de10ac0-53e1-5231-a840-d5db2413718d", 00:15:43.727 "is_configured": true, 00:15:43.727 "data_offset": 2048, 00:15:43.727 "data_size": 63488 00:15:43.727 }, 00:15:43.727 { 00:15:43.727 "name": "BaseBdev2", 00:15:43.727 "uuid": "3a42c39e-8c8e-56a4-a976-c47cb61bf0d2", 00:15:43.727 "is_configured": true, 00:15:43.727 "data_offset": 2048, 00:15:43.727 "data_size": 63488 00:15:43.727 }, 00:15:43.727 { 00:15:43.727 "name": "BaseBdev3", 00:15:43.727 "uuid": "73d3c329-d309-569f-bd4a-4c18661b6900", 00:15:43.727 "is_configured": true, 00:15:43.727 "data_offset": 2048, 00:15:43.727 "data_size": 63488 00:15:43.727 } 00:15:43.727 ] 00:15:43.727 }' 00:15:43.727 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:43.727 13:41:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:44.295 13:41:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:44.554 [2024-07-12 13:41:33.079172] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:44.554 [2024-07-12 13:41:33.079212] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:44.554 [2024-07-12 
13:41:33.082367] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:44.554 [2024-07-12 13:41:33.082406] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:44.555 [2024-07-12 13:41:33.082442] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:44.555 [2024-07-12 13:41:33.082460] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfecbc0 name raid_bdev1, state offline 00:15:44.555 0 00:15:44.555 13:41:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 472484 00:15:44.555 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 472484 ']' 00:15:44.555 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 472484 00:15:44.555 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:44.555 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:44.555 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 472484 00:15:44.814 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:44.814 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:44.814 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 472484' 00:15:44.814 killing process with pid 472484 00:15:44.814 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 472484 00:15:44.814 [2024-07-12 13:41:33.163352] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:44.814 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 472484 00:15:44.814 [2024-07-12 13:41:33.184661] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:45.073 13:41:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.tMXV5LMRQA 00:15:45.073 13:41:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:45.073 13:41:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:45.073 13:41:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:15:45.073 13:41:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:15:45.073 13:41:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:45.073 13:41:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:45.073 13:41:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:15:45.073 00:15:45.073 real 0m7.856s 00:15:45.073 user 0m12.686s 00:15:45.073 sys 0m1.379s 00:15:45.073 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:45.073 13:41:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.073 ************************************ 00:15:45.073 END TEST raid_read_error_test 00:15:45.073 ************************************ 00:15:45.073 13:41:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:45.073 13:41:33 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:15:45.073 13:41:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:45.073 
13:41:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:45.073 13:41:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:45.073 ************************************ 00:15:45.073 START TEST raid_write_error_test 00:15:45.073 ************************************ 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:45.073 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.cLMwmMlZvs 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=473607 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 473607 /var/tmp/spdk-raid.sock 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 473607 ']' 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:45.074 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:45.074 13:41:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.074 [2024-07-12 13:41:33.588849] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:15:45.074 [2024-07-12 13:41:33.588920] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid473607 ] 00:15:45.332 [2024-07-12 13:41:33.718424] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:45.332 [2024-07-12 13:41:33.827002] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:45.332 [2024-07-12 13:41:33.889852] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:45.332 [2024-07-12 13:41:33.889880] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:46.267 13:41:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:46.267 13:41:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:46.267 13:41:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:46.267 13:41:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:46.267 BaseBdev1_malloc 00:15:46.267 13:41:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:46.524 true 00:15:46.524 13:41:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:46.783 [2024-07-12 13:41:35.245268] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:46.783 [2024-07-12 13:41:35.245312] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:46.783 [2024-07-12 13:41:35.245338] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1390a10 00:15:46.783 [2024-07-12 13:41:35.245352] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:46.783 [2024-07-12 13:41:35.247195] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:46.783 [2024-07-12 13:41:35.247224] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:15:46.783 BaseBdev1 00:15:46.783 13:41:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:46.783 13:41:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:47.041 BaseBdev2_malloc 00:15:47.041 13:41:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:47.299 true 00:15:47.299 13:41:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:47.557 [2024-07-12 13:41:35.983810] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:47.557 [2024-07-12 13:41:35.983855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:47.557 [2024-07-12 13:41:35.983882] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1395250 00:15:47.557 [2024-07-12 13:41:35.983895] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:47.557 [2024-07-12 13:41:35.985546] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:47.557 [2024-07-12 13:41:35.985576] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:47.557 BaseBdev2 00:15:47.557 13:41:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:47.557 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:47.815 BaseBdev3_malloc 00:15:47.815 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:48.074 true 00:15:48.074 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:48.332 [2024-07-12 13:41:36.707535] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:48.332 [2024-07-12 13:41:36.707581] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:48.332 [2024-07-12 13:41:36.707609] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1397510 00:15:48.332 [2024-07-12 13:41:36.707622] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:48.332 [2024-07-12 13:41:36.709233] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:48.332 [2024-07-12 13:41:36.709262] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:48.332 BaseBdev3 00:15:48.332 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:48.591 [2024-07-12 13:41:36.952207] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:48.591 [2024-07-12 13:41:36.953555] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:48.591 [2024-07-12 13:41:36.953626] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:48.591 [2024-07-12 13:41:36.953830] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1398bc0 00:15:48.591 [2024-07-12 13:41:36.953841] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:48.591 [2024-07-12 13:41:36.954047] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1398760 00:15:48.591 [2024-07-12 13:41:36.954195] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1398bc0 00:15:48.591 [2024-07-12 13:41:36.954206] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1398bc0 00:15:48.591 [2024-07-12 13:41:36.954307] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:48.591 13:41:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.850 13:41:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.850 "name": "raid_bdev1", 00:15:48.850 "uuid": "982cafea-98e1-4e2e-852f-af1d48586675", 00:15:48.850 "strip_size_kb": 64, 00:15:48.850 "state": "online", 00:15:48.850 "raid_level": "concat", 00:15:48.850 "superblock": true, 00:15:48.850 "num_base_bdevs": 3, 00:15:48.850 "num_base_bdevs_discovered": 3, 00:15:48.850 "num_base_bdevs_operational": 3, 00:15:48.850 "base_bdevs_list": [ 00:15:48.850 { 00:15:48.850 "name": "BaseBdev1", 00:15:48.850 "uuid": "e89dd1fa-d3ff-5ffe-a158-0d2a94b16d68", 00:15:48.850 "is_configured": true, 00:15:48.850 "data_offset": 2048, 00:15:48.850 "data_size": 63488 00:15:48.850 }, 00:15:48.850 { 00:15:48.850 "name": "BaseBdev2", 00:15:48.850 "uuid": "5233e88e-bbbe-51c0-9da9-d876e7987300", 00:15:48.850 "is_configured": true, 00:15:48.850 "data_offset": 2048, 00:15:48.850 "data_size": 63488 00:15:48.850 }, 00:15:48.850 { 00:15:48.850 "name": "BaseBdev3", 00:15:48.850 "uuid": 
"0a66a3d4-a06f-5cb5-b315-c2b12bf85632", 00:15:48.850 "is_configured": true, 00:15:48.850 "data_offset": 2048, 00:15:48.850 "data_size": 63488 00:15:48.850 } 00:15:48.850 ] 00:15:48.850 }' 00:15:48.850 13:41:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.850 13:41:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.418 13:41:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:49.418 13:41:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:49.418 [2024-07-12 13:41:37.951143] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e6e10 00:15:50.354 13:41:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.613 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:50.872 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.872 "name": "raid_bdev1", 00:15:50.872 "uuid": "982cafea-98e1-4e2e-852f-af1d48586675", 00:15:50.872 "strip_size_kb": 64, 00:15:50.872 "state": "online", 00:15:50.872 "raid_level": "concat", 00:15:50.872 "superblock": true, 00:15:50.872 "num_base_bdevs": 3, 00:15:50.872 "num_base_bdevs_discovered": 3, 00:15:50.872 "num_base_bdevs_operational": 3, 00:15:50.872 "base_bdevs_list": [ 00:15:50.872 { 00:15:50.872 "name": "BaseBdev1", 00:15:50.872 "uuid": "e89dd1fa-d3ff-5ffe-a158-0d2a94b16d68", 00:15:50.872 "is_configured": true, 00:15:50.872 "data_offset": 2048, 00:15:50.872 "data_size": 63488 00:15:50.872 }, 00:15:50.872 { 
00:15:50.872 "name": "BaseBdev2", 00:15:50.872 "uuid": "5233e88e-bbbe-51c0-9da9-d876e7987300", 00:15:50.872 "is_configured": true, 00:15:50.872 "data_offset": 2048, 00:15:50.872 "data_size": 63488 00:15:50.872 }, 00:15:50.872 { 00:15:50.872 "name": "BaseBdev3", 00:15:50.872 "uuid": "0a66a3d4-a06f-5cb5-b315-c2b12bf85632", 00:15:50.872 "is_configured": true, 00:15:50.872 "data_offset": 2048, 00:15:50.872 "data_size": 63488 00:15:50.872 } 00:15:50.872 ] 00:15:50.872 }' 00:15:50.872 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.872 13:41:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.439 13:41:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:51.697 [2024-07-12 13:41:40.035773] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:51.697 [2024-07-12 13:41:40.035809] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:51.697 [2024-07-12 13:41:40.038978] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:51.697 [2024-07-12 13:41:40.039015] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:51.697 [2024-07-12 13:41:40.039050] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:51.697 [2024-07-12 13:41:40.039061] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1398bc0 name raid_bdev1, state offline 00:15:51.697 0 00:15:51.697 13:41:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 473607 00:15:51.697 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 473607 ']' 00:15:51.697 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 473607 00:15:51.697 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:51.697 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:51.697 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 473607 00:15:51.697 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:51.697 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:51.697 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 473607' 00:15:51.697 killing process with pid 473607 00:15:51.697 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 473607 00:15:51.697 [2024-07-12 13:41:40.120020] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:51.697 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 473607 00:15:51.697 [2024-07-12 13:41:40.140712] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:51.956 13:41:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.cLMwmMlZvs 00:15:51.956 13:41:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:51.956 13:41:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:51.956 13:41:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # 
fail_per_s=0.48 00:15:51.956 13:41:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:15:51.956 13:41:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:51.956 13:41:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:51.956 13:41:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.48 != \0\.\0\0 ]] 00:15:51.956 00:15:51.956 real 0m6.870s 00:15:51.956 user 0m10.842s 00:15:51.956 sys 0m1.219s 00:15:51.956 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:51.956 13:41:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.956 ************************************ 00:15:51.956 END TEST raid_write_error_test 00:15:51.956 ************************************ 00:15:51.956 13:41:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:51.956 13:41:40 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:51.956 13:41:40 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:15:51.956 13:41:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:51.956 13:41:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:51.956 13:41:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:51.956 ************************************ 00:15:51.956 START TEST raid_state_function_test 00:15:51.956 ************************************ 00:15:51.956 13:41:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:15:51.956 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:51.956 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:51.956 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:51.956 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:51.956 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:51.956 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:51.957 13:41:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=474606 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 474606' 00:15:51.957 Process raid pid: 474606 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 474606 /var/tmp/spdk-raid.sock 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 474606 ']' 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:51.957 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:51.957 13:41:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.957 [2024-07-12 13:41:40.537910] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
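The state-function test that begins here drives the raid module without any backing bdevs at first: bdev_raid_create is called with base bdev names that do not exist yet, and the array is expected to stay in the "configuring" state until all three bases appear. A compressed sketch of that check, assuming the same RPC socket as above (the jq filter mirrors the one used by verify_raid_bdev_state in the trace):

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Register a raid1 array whose base bdevs have not been created yet.
$rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

# The raid must report state "configuring" with zero discovered base bdevs.
info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
state=$(echo "$info" | jq -r '.state')
discovered=$(echo "$info" | jq -r '.num_base_bdevs_discovered')
[ "$state" = configuring ] && [ "$discovered" -eq 0 ] || echo "unexpected raid state: $state/$discovered"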
00:15:51.957 [2024-07-12 13:41:40.537994] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:52.215 [2024-07-12 13:41:40.670248] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:52.215 [2024-07-12 13:41:40.774163] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:52.474 [2024-07-12 13:41:40.842099] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:52.474 [2024-07-12 13:41:40.842138] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:53.041 13:41:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:53.041 13:41:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:53.041 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:53.301 [2024-07-12 13:41:41.701239] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:53.301 [2024-07-12 13:41:41.701284] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:53.301 [2024-07-12 13:41:41.701295] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:53.301 [2024-07-12 13:41:41.701307] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:53.301 [2024-07-12 13:41:41.701316] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:53.301 [2024-07-12 13:41:41.701327] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.301 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:53.560 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:15:53.560 "name": "Existed_Raid", 00:15:53.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:53.560 "strip_size_kb": 0, 00:15:53.560 "state": "configuring", 00:15:53.560 "raid_level": "raid1", 00:15:53.560 "superblock": false, 00:15:53.560 "num_base_bdevs": 3, 00:15:53.560 "num_base_bdevs_discovered": 0, 00:15:53.560 "num_base_bdevs_operational": 3, 00:15:53.560 "base_bdevs_list": [ 00:15:53.560 { 00:15:53.560 "name": "BaseBdev1", 00:15:53.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:53.560 "is_configured": false, 00:15:53.560 "data_offset": 0, 00:15:53.560 "data_size": 0 00:15:53.560 }, 00:15:53.560 { 00:15:53.560 "name": "BaseBdev2", 00:15:53.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:53.560 "is_configured": false, 00:15:53.560 "data_offset": 0, 00:15:53.560 "data_size": 0 00:15:53.560 }, 00:15:53.560 { 00:15:53.560 "name": "BaseBdev3", 00:15:53.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:53.560 "is_configured": false, 00:15:53.560 "data_offset": 0, 00:15:53.560 "data_size": 0 00:15:53.560 } 00:15:53.560 ] 00:15:53.560 }' 00:15:53.560 13:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:53.560 13:41:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.126 13:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:54.384 [2024-07-12 13:41:42.804017] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:54.384 [2024-07-12 13:41:42.804049] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1707350 name Existed_Raid, state configuring 00:15:54.384 13:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:54.645 [2024-07-12 13:41:43.052680] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:54.645 [2024-07-12 13:41:43.052709] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:54.645 [2024-07-12 13:41:43.052719] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:54.645 [2024-07-12 13:41:43.052730] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:54.645 [2024-07-12 13:41:43.052739] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:54.645 [2024-07-12 13:41:43.052750] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:54.645 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:54.903 [2024-07-12 13:41:43.319314] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:54.903 BaseBdev1 00:15:54.903 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:54.903 13:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:54.903 13:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:54.903 
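waitforbdev, which runs next in the trace, is in essence two RPCs: let any pending examine callbacks finish, then query the bdev by name with a timeout so the call blocks until it is registered. A rough equivalent of that pattern, using the 2000 ms timeout seen in the log (the helper name is illustrative, not the real autotest function):

wait_for_bdev() {            # illustrative helper, not the real waitforbdev
    local name=$1 timeout_ms=${2:-2000}
    local rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # let all registered examine callbacks finish before querying
    $rpc bdev_wait_for_examine
    # -t makes bdev_get_bdevs wait up to timeout_ms for the bdev to appear
    $rpc bdev_get_bdevs -b "$name" -t "$timeout_ms" > /dev/null
}

wait_for_bdev BaseBdev1      # returns once the freshly created malloc bdev is registered

Once BaseBdev1 is visible, the verify_raid_bdev_state output that follows shows num_base_bdevs_discovered climbing to 1 while the array remains in "configuring" until the remaining two base bdevs are added.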
13:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:54.903 13:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:54.903 13:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:54.903 13:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:55.162 13:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:55.421 [ 00:15:55.421 { 00:15:55.421 "name": "BaseBdev1", 00:15:55.421 "aliases": [ 00:15:55.421 "47118484-25fd-4fb5-bee0-5ed9e61c25af" 00:15:55.421 ], 00:15:55.421 "product_name": "Malloc disk", 00:15:55.421 "block_size": 512, 00:15:55.421 "num_blocks": 65536, 00:15:55.421 "uuid": "47118484-25fd-4fb5-bee0-5ed9e61c25af", 00:15:55.421 "assigned_rate_limits": { 00:15:55.421 "rw_ios_per_sec": 0, 00:15:55.421 "rw_mbytes_per_sec": 0, 00:15:55.421 "r_mbytes_per_sec": 0, 00:15:55.421 "w_mbytes_per_sec": 0 00:15:55.421 }, 00:15:55.421 "claimed": true, 00:15:55.421 "claim_type": "exclusive_write", 00:15:55.421 "zoned": false, 00:15:55.421 "supported_io_types": { 00:15:55.421 "read": true, 00:15:55.421 "write": true, 00:15:55.421 "unmap": true, 00:15:55.421 "flush": true, 00:15:55.421 "reset": true, 00:15:55.421 "nvme_admin": false, 00:15:55.421 "nvme_io": false, 00:15:55.421 "nvme_io_md": false, 00:15:55.421 "write_zeroes": true, 00:15:55.421 "zcopy": true, 00:15:55.421 "get_zone_info": false, 00:15:55.421 "zone_management": false, 00:15:55.421 "zone_append": false, 00:15:55.421 "compare": false, 00:15:55.421 "compare_and_write": false, 00:15:55.421 "abort": true, 00:15:55.421 "seek_hole": false, 00:15:55.421 "seek_data": false, 00:15:55.421 "copy": true, 00:15:55.421 "nvme_iov_md": false 00:15:55.421 }, 00:15:55.421 "memory_domains": [ 00:15:55.421 { 00:15:55.421 "dma_device_id": "system", 00:15:55.421 "dma_device_type": 1 00:15:55.421 }, 00:15:55.421 { 00:15:55.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.421 "dma_device_type": 2 00:15:55.421 } 00:15:55.421 ], 00:15:55.421 "driver_specific": {} 00:15:55.421 } 00:15:55.421 ] 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.421 13:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:55.679 13:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.679 "name": "Existed_Raid", 00:15:55.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.679 "strip_size_kb": 0, 00:15:55.679 "state": "configuring", 00:15:55.679 "raid_level": "raid1", 00:15:55.679 "superblock": false, 00:15:55.679 "num_base_bdevs": 3, 00:15:55.679 "num_base_bdevs_discovered": 1, 00:15:55.679 "num_base_bdevs_operational": 3, 00:15:55.679 "base_bdevs_list": [ 00:15:55.679 { 00:15:55.679 "name": "BaseBdev1", 00:15:55.679 "uuid": "47118484-25fd-4fb5-bee0-5ed9e61c25af", 00:15:55.679 "is_configured": true, 00:15:55.679 "data_offset": 0, 00:15:55.679 "data_size": 65536 00:15:55.679 }, 00:15:55.679 { 00:15:55.679 "name": "BaseBdev2", 00:15:55.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.679 "is_configured": false, 00:15:55.679 "data_offset": 0, 00:15:55.679 "data_size": 0 00:15:55.679 }, 00:15:55.679 { 00:15:55.679 "name": "BaseBdev3", 00:15:55.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.679 "is_configured": false, 00:15:55.679 "data_offset": 0, 00:15:55.679 "data_size": 0 00:15:55.679 } 00:15:55.679 ] 00:15:55.679 }' 00:15:55.679 13:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.679 13:41:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:56.246 13:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:56.505 [2024-07-12 13:41:44.959658] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:56.505 [2024-07-12 13:41:44.959700] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1706c20 name Existed_Raid, state configuring 00:15:56.505 13:41:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:56.764 [2024-07-12 13:41:45.208341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:56.764 [2024-07-12 13:41:45.209818] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:56.764 [2024-07-12 13:41:45.209852] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:56.764 [2024-07-12 13:41:45.209862] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:56.764 [2024-07-12 13:41:45.209873] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.764 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:57.022 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:57.022 "name": "Existed_Raid", 00:15:57.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.022 "strip_size_kb": 0, 00:15:57.022 "state": "configuring", 00:15:57.022 "raid_level": "raid1", 00:15:57.022 "superblock": false, 00:15:57.022 "num_base_bdevs": 3, 00:15:57.022 "num_base_bdevs_discovered": 1, 00:15:57.022 "num_base_bdevs_operational": 3, 00:15:57.022 "base_bdevs_list": [ 00:15:57.022 { 00:15:57.022 "name": "BaseBdev1", 00:15:57.022 "uuid": "47118484-25fd-4fb5-bee0-5ed9e61c25af", 00:15:57.022 "is_configured": true, 00:15:57.022 "data_offset": 0, 00:15:57.022 "data_size": 65536 00:15:57.022 }, 00:15:57.022 { 00:15:57.022 "name": "BaseBdev2", 00:15:57.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.022 "is_configured": false, 00:15:57.022 "data_offset": 0, 00:15:57.022 "data_size": 0 00:15:57.022 }, 00:15:57.022 { 00:15:57.022 "name": "BaseBdev3", 00:15:57.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.022 "is_configured": false, 00:15:57.022 "data_offset": 0, 00:15:57.022 "data_size": 0 00:15:57.022 } 00:15:57.022 ] 00:15:57.022 }' 00:15:57.022 13:41:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:57.022 13:41:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.588 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:57.847 [2024-07-12 13:41:46.318720] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:57.847 BaseBdev2 00:15:57.847 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:57.847 13:41:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:57.847 13:41:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:57.847 13:41:46 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:57.847 13:41:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:57.847 13:41:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:57.847 13:41:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:58.105 13:41:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:58.364 [ 00:15:58.364 { 00:15:58.364 "name": "BaseBdev2", 00:15:58.364 "aliases": [ 00:15:58.364 "41ee949b-1460-4b96-8bfa-a1492c6f342b" 00:15:58.364 ], 00:15:58.364 "product_name": "Malloc disk", 00:15:58.364 "block_size": 512, 00:15:58.364 "num_blocks": 65536, 00:15:58.364 "uuid": "41ee949b-1460-4b96-8bfa-a1492c6f342b", 00:15:58.364 "assigned_rate_limits": { 00:15:58.364 "rw_ios_per_sec": 0, 00:15:58.364 "rw_mbytes_per_sec": 0, 00:15:58.364 "r_mbytes_per_sec": 0, 00:15:58.364 "w_mbytes_per_sec": 0 00:15:58.364 }, 00:15:58.364 "claimed": true, 00:15:58.364 "claim_type": "exclusive_write", 00:15:58.364 "zoned": false, 00:15:58.364 "supported_io_types": { 00:15:58.364 "read": true, 00:15:58.364 "write": true, 00:15:58.364 "unmap": true, 00:15:58.364 "flush": true, 00:15:58.364 "reset": true, 00:15:58.364 "nvme_admin": false, 00:15:58.364 "nvme_io": false, 00:15:58.364 "nvme_io_md": false, 00:15:58.364 "write_zeroes": true, 00:15:58.364 "zcopy": true, 00:15:58.364 "get_zone_info": false, 00:15:58.364 "zone_management": false, 00:15:58.364 "zone_append": false, 00:15:58.364 "compare": false, 00:15:58.364 "compare_and_write": false, 00:15:58.364 "abort": true, 00:15:58.364 "seek_hole": false, 00:15:58.364 "seek_data": false, 00:15:58.364 "copy": true, 00:15:58.364 "nvme_iov_md": false 00:15:58.364 }, 00:15:58.364 "memory_domains": [ 00:15:58.364 { 00:15:58.364 "dma_device_id": "system", 00:15:58.364 "dma_device_type": 1 00:15:58.364 }, 00:15:58.364 { 00:15:58.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.364 "dma_device_type": 2 00:15:58.364 } 00:15:58.364 ], 00:15:58.364 "driver_specific": {} 00:15:58.364 } 00:15:58.364 ] 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.364 
13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:58.364 13:41:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.624 13:41:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.624 "name": "Existed_Raid", 00:15:58.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.624 "strip_size_kb": 0, 00:15:58.624 "state": "configuring", 00:15:58.624 "raid_level": "raid1", 00:15:58.624 "superblock": false, 00:15:58.624 "num_base_bdevs": 3, 00:15:58.624 "num_base_bdevs_discovered": 2, 00:15:58.624 "num_base_bdevs_operational": 3, 00:15:58.624 "base_bdevs_list": [ 00:15:58.624 { 00:15:58.624 "name": "BaseBdev1", 00:15:58.624 "uuid": "47118484-25fd-4fb5-bee0-5ed9e61c25af", 00:15:58.624 "is_configured": true, 00:15:58.624 "data_offset": 0, 00:15:58.624 "data_size": 65536 00:15:58.624 }, 00:15:58.624 { 00:15:58.624 "name": "BaseBdev2", 00:15:58.624 "uuid": "41ee949b-1460-4b96-8bfa-a1492c6f342b", 00:15:58.624 "is_configured": true, 00:15:58.624 "data_offset": 0, 00:15:58.624 "data_size": 65536 00:15:58.624 }, 00:15:58.624 { 00:15:58.624 "name": "BaseBdev3", 00:15:58.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.624 "is_configured": false, 00:15:58.624 "data_offset": 0, 00:15:58.624 "data_size": 0 00:15:58.624 } 00:15:58.624 ] 00:15:58.624 }' 00:15:58.624 13:41:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.624 13:41:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:59.191 13:41:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:59.451 [2024-07-12 13:41:47.926497] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:59.451 [2024-07-12 13:41:47.926542] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1707b10 00:15:59.451 [2024-07-12 13:41:47.926551] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:15:59.451 [2024-07-12 13:41:47.926741] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17077e0 00:15:59.451 [2024-07-12 13:41:47.926866] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1707b10 00:15:59.451 [2024-07-12 13:41:47.926876] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1707b10 00:15:59.451 [2024-07-12 13:41:47.927054] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:59.451 BaseBdev3 00:15:59.451 13:41:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:59.451 13:41:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:59.451 13:41:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:59.451 13:41:47 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:59.451 13:41:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:59.451 13:41:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:59.451 13:41:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:59.709 13:41:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:59.967 [ 00:15:59.967 { 00:15:59.967 "name": "BaseBdev3", 00:15:59.967 "aliases": [ 00:15:59.967 "2b7c74d2-1b2e-4c73-87eb-b9b728cfba23" 00:15:59.967 ], 00:15:59.967 "product_name": "Malloc disk", 00:15:59.967 "block_size": 512, 00:15:59.968 "num_blocks": 65536, 00:15:59.968 "uuid": "2b7c74d2-1b2e-4c73-87eb-b9b728cfba23", 00:15:59.968 "assigned_rate_limits": { 00:15:59.968 "rw_ios_per_sec": 0, 00:15:59.968 "rw_mbytes_per_sec": 0, 00:15:59.968 "r_mbytes_per_sec": 0, 00:15:59.968 "w_mbytes_per_sec": 0 00:15:59.968 }, 00:15:59.968 "claimed": true, 00:15:59.968 "claim_type": "exclusive_write", 00:15:59.968 "zoned": false, 00:15:59.968 "supported_io_types": { 00:15:59.968 "read": true, 00:15:59.968 "write": true, 00:15:59.968 "unmap": true, 00:15:59.968 "flush": true, 00:15:59.968 "reset": true, 00:15:59.968 "nvme_admin": false, 00:15:59.968 "nvme_io": false, 00:15:59.968 "nvme_io_md": false, 00:15:59.968 "write_zeroes": true, 00:15:59.968 "zcopy": true, 00:15:59.968 "get_zone_info": false, 00:15:59.968 "zone_management": false, 00:15:59.968 "zone_append": false, 00:15:59.968 "compare": false, 00:15:59.968 "compare_and_write": false, 00:15:59.968 "abort": true, 00:15:59.968 "seek_hole": false, 00:15:59.968 "seek_data": false, 00:15:59.968 "copy": true, 00:15:59.968 "nvme_iov_md": false 00:15:59.968 }, 00:15:59.968 "memory_domains": [ 00:15:59.968 { 00:15:59.968 "dma_device_id": "system", 00:15:59.968 "dma_device_type": 1 00:15:59.968 }, 00:15:59.968 { 00:15:59.968 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.968 "dma_device_type": 2 00:15:59.968 } 00:15:59.968 ], 00:15:59.968 "driver_specific": {} 00:15:59.968 } 00:15:59.968 ] 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.968 13:41:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.968 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.226 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:00.226 "name": "Existed_Raid", 00:16:00.226 "uuid": "aa575cef-e8b8-48d0-b0a0-a58d50c80497", 00:16:00.226 "strip_size_kb": 0, 00:16:00.226 "state": "online", 00:16:00.226 "raid_level": "raid1", 00:16:00.226 "superblock": false, 00:16:00.226 "num_base_bdevs": 3, 00:16:00.226 "num_base_bdevs_discovered": 3, 00:16:00.226 "num_base_bdevs_operational": 3, 00:16:00.226 "base_bdevs_list": [ 00:16:00.226 { 00:16:00.226 "name": "BaseBdev1", 00:16:00.226 "uuid": "47118484-25fd-4fb5-bee0-5ed9e61c25af", 00:16:00.226 "is_configured": true, 00:16:00.226 "data_offset": 0, 00:16:00.226 "data_size": 65536 00:16:00.226 }, 00:16:00.226 { 00:16:00.226 "name": "BaseBdev2", 00:16:00.226 "uuid": "41ee949b-1460-4b96-8bfa-a1492c6f342b", 00:16:00.226 "is_configured": true, 00:16:00.226 "data_offset": 0, 00:16:00.226 "data_size": 65536 00:16:00.226 }, 00:16:00.226 { 00:16:00.226 "name": "BaseBdev3", 00:16:00.226 "uuid": "2b7c74d2-1b2e-4c73-87eb-b9b728cfba23", 00:16:00.226 "is_configured": true, 00:16:00.226 "data_offset": 0, 00:16:00.226 "data_size": 65536 00:16:00.226 } 00:16:00.226 ] 00:16:00.226 }' 00:16:00.226 13:41:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:00.226 13:41:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.792 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:00.792 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:00.792 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:00.792 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:00.792 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:00.792 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:00.792 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:00.792 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:01.051 [2024-07-12 13:41:49.394716] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:01.051 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:01.051 "name": "Existed_Raid", 00:16:01.051 "aliases": [ 00:16:01.051 "aa575cef-e8b8-48d0-b0a0-a58d50c80497" 00:16:01.051 ], 00:16:01.051 "product_name": "Raid Volume", 00:16:01.051 "block_size": 512, 00:16:01.051 "num_blocks": 65536, 00:16:01.051 "uuid": "aa575cef-e8b8-48d0-b0a0-a58d50c80497", 
00:16:01.051 "assigned_rate_limits": { 00:16:01.051 "rw_ios_per_sec": 0, 00:16:01.051 "rw_mbytes_per_sec": 0, 00:16:01.051 "r_mbytes_per_sec": 0, 00:16:01.051 "w_mbytes_per_sec": 0 00:16:01.051 }, 00:16:01.051 "claimed": false, 00:16:01.051 "zoned": false, 00:16:01.051 "supported_io_types": { 00:16:01.051 "read": true, 00:16:01.051 "write": true, 00:16:01.051 "unmap": false, 00:16:01.051 "flush": false, 00:16:01.051 "reset": true, 00:16:01.051 "nvme_admin": false, 00:16:01.051 "nvme_io": false, 00:16:01.051 "nvme_io_md": false, 00:16:01.051 "write_zeroes": true, 00:16:01.051 "zcopy": false, 00:16:01.051 "get_zone_info": false, 00:16:01.051 "zone_management": false, 00:16:01.051 "zone_append": false, 00:16:01.051 "compare": false, 00:16:01.051 "compare_and_write": false, 00:16:01.051 "abort": false, 00:16:01.051 "seek_hole": false, 00:16:01.051 "seek_data": false, 00:16:01.051 "copy": false, 00:16:01.051 "nvme_iov_md": false 00:16:01.051 }, 00:16:01.051 "memory_domains": [ 00:16:01.051 { 00:16:01.051 "dma_device_id": "system", 00:16:01.051 "dma_device_type": 1 00:16:01.051 }, 00:16:01.051 { 00:16:01.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.051 "dma_device_type": 2 00:16:01.051 }, 00:16:01.051 { 00:16:01.051 "dma_device_id": "system", 00:16:01.051 "dma_device_type": 1 00:16:01.051 }, 00:16:01.051 { 00:16:01.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.051 "dma_device_type": 2 00:16:01.051 }, 00:16:01.051 { 00:16:01.051 "dma_device_id": "system", 00:16:01.051 "dma_device_type": 1 00:16:01.051 }, 00:16:01.051 { 00:16:01.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.051 "dma_device_type": 2 00:16:01.051 } 00:16:01.051 ], 00:16:01.051 "driver_specific": { 00:16:01.051 "raid": { 00:16:01.051 "uuid": "aa575cef-e8b8-48d0-b0a0-a58d50c80497", 00:16:01.051 "strip_size_kb": 0, 00:16:01.051 "state": "online", 00:16:01.051 "raid_level": "raid1", 00:16:01.051 "superblock": false, 00:16:01.051 "num_base_bdevs": 3, 00:16:01.051 "num_base_bdevs_discovered": 3, 00:16:01.051 "num_base_bdevs_operational": 3, 00:16:01.051 "base_bdevs_list": [ 00:16:01.051 { 00:16:01.051 "name": "BaseBdev1", 00:16:01.051 "uuid": "47118484-25fd-4fb5-bee0-5ed9e61c25af", 00:16:01.051 "is_configured": true, 00:16:01.051 "data_offset": 0, 00:16:01.051 "data_size": 65536 00:16:01.051 }, 00:16:01.051 { 00:16:01.051 "name": "BaseBdev2", 00:16:01.051 "uuid": "41ee949b-1460-4b96-8bfa-a1492c6f342b", 00:16:01.051 "is_configured": true, 00:16:01.051 "data_offset": 0, 00:16:01.051 "data_size": 65536 00:16:01.051 }, 00:16:01.051 { 00:16:01.051 "name": "BaseBdev3", 00:16:01.051 "uuid": "2b7c74d2-1b2e-4c73-87eb-b9b728cfba23", 00:16:01.051 "is_configured": true, 00:16:01.051 "data_offset": 0, 00:16:01.051 "data_size": 65536 00:16:01.051 } 00:16:01.051 ] 00:16:01.051 } 00:16:01.051 } 00:16:01.051 }' 00:16:01.051 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:01.051 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:01.051 BaseBdev2 00:16:01.051 BaseBdev3' 00:16:01.051 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:01.051 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:01.051 13:41:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:01.617 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:01.617 "name": "BaseBdev1", 00:16:01.617 "aliases": [ 00:16:01.617 "47118484-25fd-4fb5-bee0-5ed9e61c25af" 00:16:01.617 ], 00:16:01.617 "product_name": "Malloc disk", 00:16:01.617 "block_size": 512, 00:16:01.617 "num_blocks": 65536, 00:16:01.618 "uuid": "47118484-25fd-4fb5-bee0-5ed9e61c25af", 00:16:01.618 "assigned_rate_limits": { 00:16:01.618 "rw_ios_per_sec": 0, 00:16:01.618 "rw_mbytes_per_sec": 0, 00:16:01.618 "r_mbytes_per_sec": 0, 00:16:01.618 "w_mbytes_per_sec": 0 00:16:01.618 }, 00:16:01.618 "claimed": true, 00:16:01.618 "claim_type": "exclusive_write", 00:16:01.618 "zoned": false, 00:16:01.618 "supported_io_types": { 00:16:01.618 "read": true, 00:16:01.618 "write": true, 00:16:01.618 "unmap": true, 00:16:01.618 "flush": true, 00:16:01.618 "reset": true, 00:16:01.618 "nvme_admin": false, 00:16:01.618 "nvme_io": false, 00:16:01.618 "nvme_io_md": false, 00:16:01.618 "write_zeroes": true, 00:16:01.618 "zcopy": true, 00:16:01.618 "get_zone_info": false, 00:16:01.618 "zone_management": false, 00:16:01.618 "zone_append": false, 00:16:01.618 "compare": false, 00:16:01.618 "compare_and_write": false, 00:16:01.618 "abort": true, 00:16:01.618 "seek_hole": false, 00:16:01.618 "seek_data": false, 00:16:01.618 "copy": true, 00:16:01.618 "nvme_iov_md": false 00:16:01.618 }, 00:16:01.618 "memory_domains": [ 00:16:01.618 { 00:16:01.618 "dma_device_id": "system", 00:16:01.618 "dma_device_type": 1 00:16:01.618 }, 00:16:01.618 { 00:16:01.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.618 "dma_device_type": 2 00:16:01.618 } 00:16:01.618 ], 00:16:01.618 "driver_specific": {} 00:16:01.618 }' 00:16:01.618 13:41:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:01.618 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:01.618 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:01.618 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:01.618 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:01.618 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:01.618 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:01.876 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:01.876 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:01.876 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.876 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.876 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:01.876 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:01.876 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:01.876 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:02.134 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:02.134 "name": "BaseBdev2", 
00:16:02.134 "aliases": [ 00:16:02.134 "41ee949b-1460-4b96-8bfa-a1492c6f342b" 00:16:02.134 ], 00:16:02.134 "product_name": "Malloc disk", 00:16:02.134 "block_size": 512, 00:16:02.134 "num_blocks": 65536, 00:16:02.134 "uuid": "41ee949b-1460-4b96-8bfa-a1492c6f342b", 00:16:02.134 "assigned_rate_limits": { 00:16:02.134 "rw_ios_per_sec": 0, 00:16:02.134 "rw_mbytes_per_sec": 0, 00:16:02.134 "r_mbytes_per_sec": 0, 00:16:02.134 "w_mbytes_per_sec": 0 00:16:02.134 }, 00:16:02.134 "claimed": true, 00:16:02.134 "claim_type": "exclusive_write", 00:16:02.134 "zoned": false, 00:16:02.134 "supported_io_types": { 00:16:02.134 "read": true, 00:16:02.134 "write": true, 00:16:02.134 "unmap": true, 00:16:02.134 "flush": true, 00:16:02.134 "reset": true, 00:16:02.134 "nvme_admin": false, 00:16:02.134 "nvme_io": false, 00:16:02.134 "nvme_io_md": false, 00:16:02.134 "write_zeroes": true, 00:16:02.134 "zcopy": true, 00:16:02.134 "get_zone_info": false, 00:16:02.134 "zone_management": false, 00:16:02.134 "zone_append": false, 00:16:02.134 "compare": false, 00:16:02.134 "compare_and_write": false, 00:16:02.134 "abort": true, 00:16:02.134 "seek_hole": false, 00:16:02.134 "seek_data": false, 00:16:02.134 "copy": true, 00:16:02.134 "nvme_iov_md": false 00:16:02.134 }, 00:16:02.134 "memory_domains": [ 00:16:02.134 { 00:16:02.134 "dma_device_id": "system", 00:16:02.134 "dma_device_type": 1 00:16:02.134 }, 00:16:02.134 { 00:16:02.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.134 "dma_device_type": 2 00:16:02.134 } 00:16:02.134 ], 00:16:02.134 "driver_specific": {} 00:16:02.134 }' 00:16:02.134 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:02.134 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:02.134 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:02.134 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:02.134 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:02.134 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:02.134 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:02.392 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:02.392 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:02.392 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:02.392 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:02.392 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:02.392 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:02.392 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:02.392 13:41:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:02.656 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:02.656 "name": "BaseBdev3", 00:16:02.656 "aliases": [ 00:16:02.656 "2b7c74d2-1b2e-4c73-87eb-b9b728cfba23" 00:16:02.656 ], 00:16:02.656 "product_name": "Malloc disk", 00:16:02.656 "block_size": 512, 
00:16:02.656 "num_blocks": 65536, 00:16:02.656 "uuid": "2b7c74d2-1b2e-4c73-87eb-b9b728cfba23", 00:16:02.656 "assigned_rate_limits": { 00:16:02.656 "rw_ios_per_sec": 0, 00:16:02.656 "rw_mbytes_per_sec": 0, 00:16:02.656 "r_mbytes_per_sec": 0, 00:16:02.656 "w_mbytes_per_sec": 0 00:16:02.656 }, 00:16:02.656 "claimed": true, 00:16:02.656 "claim_type": "exclusive_write", 00:16:02.656 "zoned": false, 00:16:02.656 "supported_io_types": { 00:16:02.656 "read": true, 00:16:02.656 "write": true, 00:16:02.656 "unmap": true, 00:16:02.656 "flush": true, 00:16:02.656 "reset": true, 00:16:02.656 "nvme_admin": false, 00:16:02.656 "nvme_io": false, 00:16:02.656 "nvme_io_md": false, 00:16:02.656 "write_zeroes": true, 00:16:02.656 "zcopy": true, 00:16:02.656 "get_zone_info": false, 00:16:02.656 "zone_management": false, 00:16:02.656 "zone_append": false, 00:16:02.656 "compare": false, 00:16:02.656 "compare_and_write": false, 00:16:02.656 "abort": true, 00:16:02.656 "seek_hole": false, 00:16:02.656 "seek_data": false, 00:16:02.656 "copy": true, 00:16:02.656 "nvme_iov_md": false 00:16:02.656 }, 00:16:02.656 "memory_domains": [ 00:16:02.656 { 00:16:02.656 "dma_device_id": "system", 00:16:02.656 "dma_device_type": 1 00:16:02.656 }, 00:16:02.656 { 00:16:02.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.656 "dma_device_type": 2 00:16:02.656 } 00:16:02.656 ], 00:16:02.656 "driver_specific": {} 00:16:02.656 }' 00:16:02.656 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:02.656 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:02.656 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:02.656 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:02.656 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:02.916 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:02.916 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:02.916 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:02.916 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:02.916 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:02.916 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:02.916 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:02.916 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:03.175 [2024-07-12 13:41:51.660484] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.175 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.434 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.434 "name": "Existed_Raid", 00:16:03.434 "uuid": "aa575cef-e8b8-48d0-b0a0-a58d50c80497", 00:16:03.434 "strip_size_kb": 0, 00:16:03.434 "state": "online", 00:16:03.434 "raid_level": "raid1", 00:16:03.434 "superblock": false, 00:16:03.434 "num_base_bdevs": 3, 00:16:03.434 "num_base_bdevs_discovered": 2, 00:16:03.434 "num_base_bdevs_operational": 2, 00:16:03.434 "base_bdevs_list": [ 00:16:03.434 { 00:16:03.434 "name": null, 00:16:03.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.434 "is_configured": false, 00:16:03.434 "data_offset": 0, 00:16:03.434 "data_size": 65536 00:16:03.434 }, 00:16:03.434 { 00:16:03.434 "name": "BaseBdev2", 00:16:03.434 "uuid": "41ee949b-1460-4b96-8bfa-a1492c6f342b", 00:16:03.434 "is_configured": true, 00:16:03.434 "data_offset": 0, 00:16:03.434 "data_size": 65536 00:16:03.434 }, 00:16:03.434 { 00:16:03.434 "name": "BaseBdev3", 00:16:03.434 "uuid": "2b7c74d2-1b2e-4c73-87eb-b9b728cfba23", 00:16:03.434 "is_configured": true, 00:16:03.434 "data_offset": 0, 00:16:03.434 "data_size": 65536 00:16:03.434 } 00:16:03.434 ] 00:16:03.434 }' 00:16:03.434 13:41:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.434 13:41:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.000 13:41:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:04.000 13:41:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:04.000 13:41:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:04.000 13:41:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.257 13:41:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:04.257 13:41:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid 
'!=' Existed_Raid ']' 00:16:04.257 13:41:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:04.514 [2024-07-12 13:41:52.941764] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:04.514 13:41:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:04.514 13:41:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:04.514 13:41:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.514 13:41:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:04.772 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:04.772 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:04.772 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:05.031 [2024-07-12 13:41:53.441677] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:05.031 [2024-07-12 13:41:53.441758] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:05.031 [2024-07-12 13:41:53.452652] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:05.031 [2024-07-12 13:41:53.452695] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:05.031 [2024-07-12 13:41:53.452706] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1707b10 name Existed_Raid, state offline 00:16:05.031 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:05.031 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:05.031 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.031 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:05.290 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:05.290 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:05.290 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:05.290 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:05.290 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:05.290 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:05.549 BaseBdev2 00:16:05.549 13:41:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:05.549 13:41:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:05.549 13:41:53 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:05.549 13:41:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:05.549 13:41:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:05.549 13:41:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:05.549 13:41:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:05.807 13:41:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:06.066 [ 00:16:06.066 { 00:16:06.066 "name": "BaseBdev2", 00:16:06.066 "aliases": [ 00:16:06.066 "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06" 00:16:06.066 ], 00:16:06.066 "product_name": "Malloc disk", 00:16:06.066 "block_size": 512, 00:16:06.066 "num_blocks": 65536, 00:16:06.066 "uuid": "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06", 00:16:06.066 "assigned_rate_limits": { 00:16:06.066 "rw_ios_per_sec": 0, 00:16:06.066 "rw_mbytes_per_sec": 0, 00:16:06.066 "r_mbytes_per_sec": 0, 00:16:06.066 "w_mbytes_per_sec": 0 00:16:06.066 }, 00:16:06.066 "claimed": false, 00:16:06.066 "zoned": false, 00:16:06.066 "supported_io_types": { 00:16:06.066 "read": true, 00:16:06.066 "write": true, 00:16:06.066 "unmap": true, 00:16:06.066 "flush": true, 00:16:06.066 "reset": true, 00:16:06.066 "nvme_admin": false, 00:16:06.066 "nvme_io": false, 00:16:06.066 "nvme_io_md": false, 00:16:06.066 "write_zeroes": true, 00:16:06.066 "zcopy": true, 00:16:06.066 "get_zone_info": false, 00:16:06.066 "zone_management": false, 00:16:06.066 "zone_append": false, 00:16:06.066 "compare": false, 00:16:06.066 "compare_and_write": false, 00:16:06.066 "abort": true, 00:16:06.066 "seek_hole": false, 00:16:06.066 "seek_data": false, 00:16:06.066 "copy": true, 00:16:06.066 "nvme_iov_md": false 00:16:06.066 }, 00:16:06.066 "memory_domains": [ 00:16:06.066 { 00:16:06.066 "dma_device_id": "system", 00:16:06.066 "dma_device_type": 1 00:16:06.066 }, 00:16:06.066 { 00:16:06.066 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.066 "dma_device_type": 2 00:16:06.066 } 00:16:06.066 ], 00:16:06.066 "driver_specific": {} 00:16:06.066 } 00:16:06.066 ] 00:16:06.066 13:41:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:06.066 13:41:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:06.066 13:41:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:06.066 13:41:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:06.324 BaseBdev3 00:16:06.324 13:41:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:06.324 13:41:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:06.324 13:41:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:06.324 13:41:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:06.324 13:41:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:06.324 13:41:54 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:06.324 13:41:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:06.583 13:41:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:06.583 [ 00:16:06.583 { 00:16:06.583 "name": "BaseBdev3", 00:16:06.583 "aliases": [ 00:16:06.583 "1edd294a-c949-47e8-8c64-a86ca019617e" 00:16:06.583 ], 00:16:06.583 "product_name": "Malloc disk", 00:16:06.583 "block_size": 512, 00:16:06.583 "num_blocks": 65536, 00:16:06.583 "uuid": "1edd294a-c949-47e8-8c64-a86ca019617e", 00:16:06.583 "assigned_rate_limits": { 00:16:06.583 "rw_ios_per_sec": 0, 00:16:06.583 "rw_mbytes_per_sec": 0, 00:16:06.583 "r_mbytes_per_sec": 0, 00:16:06.583 "w_mbytes_per_sec": 0 00:16:06.583 }, 00:16:06.583 "claimed": false, 00:16:06.583 "zoned": false, 00:16:06.583 "supported_io_types": { 00:16:06.583 "read": true, 00:16:06.583 "write": true, 00:16:06.583 "unmap": true, 00:16:06.583 "flush": true, 00:16:06.583 "reset": true, 00:16:06.583 "nvme_admin": false, 00:16:06.583 "nvme_io": false, 00:16:06.583 "nvme_io_md": false, 00:16:06.583 "write_zeroes": true, 00:16:06.583 "zcopy": true, 00:16:06.583 "get_zone_info": false, 00:16:06.583 "zone_management": false, 00:16:06.583 "zone_append": false, 00:16:06.583 "compare": false, 00:16:06.583 "compare_and_write": false, 00:16:06.583 "abort": true, 00:16:06.583 "seek_hole": false, 00:16:06.583 "seek_data": false, 00:16:06.583 "copy": true, 00:16:06.583 "nvme_iov_md": false 00:16:06.583 }, 00:16:06.583 "memory_domains": [ 00:16:06.583 { 00:16:06.583 "dma_device_id": "system", 00:16:06.583 "dma_device_type": 1 00:16:06.583 }, 00:16:06.583 { 00:16:06.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.583 "dma_device_type": 2 00:16:06.583 } 00:16:06.583 ], 00:16:06.583 "driver_specific": {} 00:16:06.583 } 00:16:06.583 ] 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:06.841 [2024-07-12 13:41:55.391209] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:06.841 [2024-07-12 13:41:55.391252] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:06.841 [2024-07-12 13:41:55.391270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:06.841 [2024-07-12 13:41:55.392585] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.841 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.098 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:07.098 "name": "Existed_Raid", 00:16:07.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.098 "strip_size_kb": 0, 00:16:07.098 "state": "configuring", 00:16:07.098 "raid_level": "raid1", 00:16:07.098 "superblock": false, 00:16:07.098 "num_base_bdevs": 3, 00:16:07.098 "num_base_bdevs_discovered": 2, 00:16:07.098 "num_base_bdevs_operational": 3, 00:16:07.098 "base_bdevs_list": [ 00:16:07.098 { 00:16:07.098 "name": "BaseBdev1", 00:16:07.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.098 "is_configured": false, 00:16:07.098 "data_offset": 0, 00:16:07.098 "data_size": 0 00:16:07.098 }, 00:16:07.098 { 00:16:07.098 "name": "BaseBdev2", 00:16:07.098 "uuid": "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06", 00:16:07.098 "is_configured": true, 00:16:07.098 "data_offset": 0, 00:16:07.098 "data_size": 65536 00:16:07.098 }, 00:16:07.098 { 00:16:07.098 "name": "BaseBdev3", 00:16:07.098 "uuid": "1edd294a-c949-47e8-8c64-a86ca019617e", 00:16:07.098 "is_configured": true, 00:16:07.098 "data_offset": 0, 00:16:07.098 "data_size": 65536 00:16:07.098 } 00:16:07.098 ] 00:16:07.098 }' 00:16:07.098 13:41:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:07.098 13:41:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:08.031 [2024-07-12 13:41:56.490107] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.031 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.289 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.289 "name": "Existed_Raid", 00:16:08.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.289 "strip_size_kb": 0, 00:16:08.289 "state": "configuring", 00:16:08.289 "raid_level": "raid1", 00:16:08.289 "superblock": false, 00:16:08.289 "num_base_bdevs": 3, 00:16:08.289 "num_base_bdevs_discovered": 1, 00:16:08.289 "num_base_bdevs_operational": 3, 00:16:08.289 "base_bdevs_list": [ 00:16:08.289 { 00:16:08.289 "name": "BaseBdev1", 00:16:08.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.289 "is_configured": false, 00:16:08.289 "data_offset": 0, 00:16:08.289 "data_size": 0 00:16:08.289 }, 00:16:08.289 { 00:16:08.289 "name": null, 00:16:08.289 "uuid": "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06", 00:16:08.289 "is_configured": false, 00:16:08.289 "data_offset": 0, 00:16:08.289 "data_size": 65536 00:16:08.289 }, 00:16:08.289 { 00:16:08.289 "name": "BaseBdev3", 00:16:08.289 "uuid": "1edd294a-c949-47e8-8c64-a86ca019617e", 00:16:08.289 "is_configured": true, 00:16:08.289 "data_offset": 0, 00:16:08.289 "data_size": 65536 00:16:08.289 } 00:16:08.289 ] 00:16:08.289 }' 00:16:08.289 13:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.289 13:41:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:08.855 13:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:08.855 13:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.113 13:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:09.113 13:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:09.114 [2024-07-12 13:41:57.676725] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:09.114 BaseBdev1 00:16:09.372 13:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:09.372 13:41:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:09.372 13:41:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:09.372 13:41:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:09.372 13:41:57 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:09.372 13:41:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:09.372 13:41:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:09.372 13:41:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:09.938 [ 00:16:09.938 { 00:16:09.938 "name": "BaseBdev1", 00:16:09.938 "aliases": [ 00:16:09.938 "833c7e9e-1b11-45d6-86f7-d00df344d337" 00:16:09.938 ], 00:16:09.938 "product_name": "Malloc disk", 00:16:09.938 "block_size": 512, 00:16:09.938 "num_blocks": 65536, 00:16:09.938 "uuid": "833c7e9e-1b11-45d6-86f7-d00df344d337", 00:16:09.938 "assigned_rate_limits": { 00:16:09.938 "rw_ios_per_sec": 0, 00:16:09.938 "rw_mbytes_per_sec": 0, 00:16:09.938 "r_mbytes_per_sec": 0, 00:16:09.938 "w_mbytes_per_sec": 0 00:16:09.938 }, 00:16:09.938 "claimed": true, 00:16:09.938 "claim_type": "exclusive_write", 00:16:09.938 "zoned": false, 00:16:09.938 "supported_io_types": { 00:16:09.938 "read": true, 00:16:09.938 "write": true, 00:16:09.938 "unmap": true, 00:16:09.938 "flush": true, 00:16:09.938 "reset": true, 00:16:09.938 "nvme_admin": false, 00:16:09.938 "nvme_io": false, 00:16:09.938 "nvme_io_md": false, 00:16:09.938 "write_zeroes": true, 00:16:09.938 "zcopy": true, 00:16:09.938 "get_zone_info": false, 00:16:09.938 "zone_management": false, 00:16:09.938 "zone_append": false, 00:16:09.938 "compare": false, 00:16:09.938 "compare_and_write": false, 00:16:09.938 "abort": true, 00:16:09.938 "seek_hole": false, 00:16:09.938 "seek_data": false, 00:16:09.938 "copy": true, 00:16:09.938 "nvme_iov_md": false 00:16:09.938 }, 00:16:09.938 "memory_domains": [ 00:16:09.938 { 00:16:09.938 "dma_device_id": "system", 00:16:09.938 "dma_device_type": 1 00:16:09.938 }, 00:16:09.938 { 00:16:09.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.938 "dma_device_type": 2 00:16:09.938 } 00:16:09.938 ], 00:16:09.938 "driver_specific": {} 00:16:09.938 } 00:16:09.938 ] 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:09.938 13:41:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.938 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:10.196 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:10.196 "name": "Existed_Raid", 00:16:10.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:10.196 "strip_size_kb": 0, 00:16:10.196 "state": "configuring", 00:16:10.196 "raid_level": "raid1", 00:16:10.196 "superblock": false, 00:16:10.196 "num_base_bdevs": 3, 00:16:10.196 "num_base_bdevs_discovered": 2, 00:16:10.196 "num_base_bdevs_operational": 3, 00:16:10.196 "base_bdevs_list": [ 00:16:10.196 { 00:16:10.196 "name": "BaseBdev1", 00:16:10.196 "uuid": "833c7e9e-1b11-45d6-86f7-d00df344d337", 00:16:10.196 "is_configured": true, 00:16:10.196 "data_offset": 0, 00:16:10.196 "data_size": 65536 00:16:10.196 }, 00:16:10.196 { 00:16:10.196 "name": null, 00:16:10.196 "uuid": "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06", 00:16:10.196 "is_configured": false, 00:16:10.196 "data_offset": 0, 00:16:10.196 "data_size": 65536 00:16:10.196 }, 00:16:10.196 { 00:16:10.196 "name": "BaseBdev3", 00:16:10.196 "uuid": "1edd294a-c949-47e8-8c64-a86ca019617e", 00:16:10.196 "is_configured": true, 00:16:10.196 "data_offset": 0, 00:16:10.196 "data_size": 65536 00:16:10.196 } 00:16:10.196 ] 00:16:10.196 }' 00:16:10.196 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:10.196 13:41:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:10.762 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.762 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:11.020 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:11.020 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:11.020 [2024-07-12 13:41:59.577836] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:11.279 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:11.279 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:11.279 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:11.280 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:11.280 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:11.280 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:11.280 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:11.280 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:11.280 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:11.280 
13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:11.280 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.280 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:11.538 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.538 "name": "Existed_Raid", 00:16:11.538 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:11.538 "strip_size_kb": 0, 00:16:11.538 "state": "configuring", 00:16:11.538 "raid_level": "raid1", 00:16:11.538 "superblock": false, 00:16:11.538 "num_base_bdevs": 3, 00:16:11.538 "num_base_bdevs_discovered": 1, 00:16:11.538 "num_base_bdevs_operational": 3, 00:16:11.538 "base_bdevs_list": [ 00:16:11.538 { 00:16:11.538 "name": "BaseBdev1", 00:16:11.538 "uuid": "833c7e9e-1b11-45d6-86f7-d00df344d337", 00:16:11.538 "is_configured": true, 00:16:11.538 "data_offset": 0, 00:16:11.538 "data_size": 65536 00:16:11.538 }, 00:16:11.538 { 00:16:11.538 "name": null, 00:16:11.538 "uuid": "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06", 00:16:11.538 "is_configured": false, 00:16:11.538 "data_offset": 0, 00:16:11.538 "data_size": 65536 00:16:11.538 }, 00:16:11.538 { 00:16:11.538 "name": null, 00:16:11.538 "uuid": "1edd294a-c949-47e8-8c64-a86ca019617e", 00:16:11.538 "is_configured": false, 00:16:11.538 "data_offset": 0, 00:16:11.538 "data_size": 65536 00:16:11.538 } 00:16:11.538 ] 00:16:11.538 }' 00:16:11.538 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.538 13:41:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:12.104 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.104 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:12.362 [2024-07-12 13:42:00.897349] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.362 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.620 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.620 "name": "Existed_Raid", 00:16:12.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.620 "strip_size_kb": 0, 00:16:12.620 "state": "configuring", 00:16:12.620 "raid_level": "raid1", 00:16:12.620 "superblock": false, 00:16:12.620 "num_base_bdevs": 3, 00:16:12.620 "num_base_bdevs_discovered": 2, 00:16:12.620 "num_base_bdevs_operational": 3, 00:16:12.620 "base_bdevs_list": [ 00:16:12.620 { 00:16:12.620 "name": "BaseBdev1", 00:16:12.620 "uuid": "833c7e9e-1b11-45d6-86f7-d00df344d337", 00:16:12.620 "is_configured": true, 00:16:12.620 "data_offset": 0, 00:16:12.620 "data_size": 65536 00:16:12.620 }, 00:16:12.620 { 00:16:12.620 "name": null, 00:16:12.620 "uuid": "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06", 00:16:12.620 "is_configured": false, 00:16:12.620 "data_offset": 0, 00:16:12.620 "data_size": 65536 00:16:12.620 }, 00:16:12.620 { 00:16:12.620 "name": "BaseBdev3", 00:16:12.620 "uuid": "1edd294a-c949-47e8-8c64-a86ca019617e", 00:16:12.620 "is_configured": true, 00:16:12.620 "data_offset": 0, 00:16:12.620 "data_size": 65536 00:16:12.620 } 00:16:12.620 ] 00:16:12.620 }' 00:16:12.621 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.621 13:42:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.555 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.555 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:13.555 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:13.555 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:14.125 [2024-07-12 13:42:02.533736] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:14.125 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:14.125 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:14.125 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:14.125 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:14.125 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:14.125 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:14.125 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:14.125 
13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:14.125 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:14.125 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:14.125 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.125 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:14.384 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:14.384 "name": "Existed_Raid", 00:16:14.384 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:14.384 "strip_size_kb": 0, 00:16:14.384 "state": "configuring", 00:16:14.384 "raid_level": "raid1", 00:16:14.384 "superblock": false, 00:16:14.384 "num_base_bdevs": 3, 00:16:14.384 "num_base_bdevs_discovered": 1, 00:16:14.384 "num_base_bdevs_operational": 3, 00:16:14.384 "base_bdevs_list": [ 00:16:14.384 { 00:16:14.384 "name": null, 00:16:14.384 "uuid": "833c7e9e-1b11-45d6-86f7-d00df344d337", 00:16:14.384 "is_configured": false, 00:16:14.384 "data_offset": 0, 00:16:14.384 "data_size": 65536 00:16:14.384 }, 00:16:14.384 { 00:16:14.384 "name": null, 00:16:14.384 "uuid": "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06", 00:16:14.384 "is_configured": false, 00:16:14.384 "data_offset": 0, 00:16:14.384 "data_size": 65536 00:16:14.384 }, 00:16:14.384 { 00:16:14.384 "name": "BaseBdev3", 00:16:14.384 "uuid": "1edd294a-c949-47e8-8c64-a86ca019617e", 00:16:14.384 "is_configured": true, 00:16:14.384 "data_offset": 0, 00:16:14.384 "data_size": 65536 00:16:14.384 } 00:16:14.384 ] 00:16:14.384 }' 00:16:14.384 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:14.384 13:42:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.949 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.949 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:15.208 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:15.208 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:15.775 [2024-07-12 13:42:04.166360] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.775 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:16.033 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.033 "name": "Existed_Raid", 00:16:16.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:16.033 "strip_size_kb": 0, 00:16:16.033 "state": "configuring", 00:16:16.033 "raid_level": "raid1", 00:16:16.033 "superblock": false, 00:16:16.033 "num_base_bdevs": 3, 00:16:16.033 "num_base_bdevs_discovered": 2, 00:16:16.033 "num_base_bdevs_operational": 3, 00:16:16.033 "base_bdevs_list": [ 00:16:16.033 { 00:16:16.033 "name": null, 00:16:16.034 "uuid": "833c7e9e-1b11-45d6-86f7-d00df344d337", 00:16:16.034 "is_configured": false, 00:16:16.034 "data_offset": 0, 00:16:16.034 "data_size": 65536 00:16:16.034 }, 00:16:16.034 { 00:16:16.034 "name": "BaseBdev2", 00:16:16.034 "uuid": "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06", 00:16:16.034 "is_configured": true, 00:16:16.034 "data_offset": 0, 00:16:16.034 "data_size": 65536 00:16:16.034 }, 00:16:16.034 { 00:16:16.034 "name": "BaseBdev3", 00:16:16.034 "uuid": "1edd294a-c949-47e8-8c64-a86ca019617e", 00:16:16.034 "is_configured": true, 00:16:16.034 "data_offset": 0, 00:16:16.034 "data_size": 65536 00:16:16.034 } 00:16:16.034 ] 00:16:16.034 }' 00:16:16.034 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.034 13:42:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.600 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:16.600 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.859 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:16.859 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.859 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:17.118 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 833c7e9e-1b11-45d6-86f7-d00df344d337 00:16:17.376 [2024-07-12 13:42:05.761956] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:17.377 [2024-07-12 13:42:05.761995] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17092b0 00:16:17.377 [2024-07-12 13:42:05.762004] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:17.377 [2024-07-12 13:42:05.762206] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17077e0 00:16:17.377 [2024-07-12 13:42:05.762332] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17092b0 00:16:17.377 [2024-07-12 13:42:05.762342] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17092b0 00:16:17.377 [2024-07-12 13:42:05.762505] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:17.377 NewBaseBdev 00:16:17.377 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:17.377 13:42:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:17.377 13:42:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:17.377 13:42:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:17.377 13:42:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:17.377 13:42:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:17.377 13:42:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:17.636 13:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:17.894 [ 00:16:17.894 { 00:16:17.894 "name": "NewBaseBdev", 00:16:17.894 "aliases": [ 00:16:17.894 "833c7e9e-1b11-45d6-86f7-d00df344d337" 00:16:17.894 ], 00:16:17.894 "product_name": "Malloc disk", 00:16:17.894 "block_size": 512, 00:16:17.894 "num_blocks": 65536, 00:16:17.894 "uuid": "833c7e9e-1b11-45d6-86f7-d00df344d337", 00:16:17.894 "assigned_rate_limits": { 00:16:17.894 "rw_ios_per_sec": 0, 00:16:17.894 "rw_mbytes_per_sec": 0, 00:16:17.894 "r_mbytes_per_sec": 0, 00:16:17.894 "w_mbytes_per_sec": 0 00:16:17.894 }, 00:16:17.894 "claimed": true, 00:16:17.894 "claim_type": "exclusive_write", 00:16:17.894 "zoned": false, 00:16:17.894 "supported_io_types": { 00:16:17.894 "read": true, 00:16:17.894 "write": true, 00:16:17.894 "unmap": true, 00:16:17.894 "flush": true, 00:16:17.894 "reset": true, 00:16:17.894 "nvme_admin": false, 00:16:17.894 "nvme_io": false, 00:16:17.894 "nvme_io_md": false, 00:16:17.894 "write_zeroes": true, 00:16:17.894 "zcopy": true, 00:16:17.894 "get_zone_info": false, 00:16:17.894 "zone_management": false, 00:16:17.894 "zone_append": false, 00:16:17.894 "compare": false, 00:16:17.894 "compare_and_write": false, 00:16:17.894 "abort": true, 00:16:17.894 "seek_hole": false, 00:16:17.894 "seek_data": false, 00:16:17.894 "copy": true, 00:16:17.894 "nvme_iov_md": false 00:16:17.894 }, 00:16:17.894 "memory_domains": [ 00:16:17.894 { 00:16:17.894 "dma_device_id": "system", 00:16:17.894 "dma_device_type": 1 00:16:17.894 }, 00:16:17.894 { 00:16:17.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.894 "dma_device_type": 2 00:16:17.894 } 00:16:17.894 ], 00:16:17.894 "driver_specific": {} 00:16:17.894 } 00:16:17.894 ] 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.894 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:18.152 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.152 "name": "Existed_Raid", 00:16:18.152 "uuid": "5cd40780-c39f-48ea-b9b7-5dff4bbc0dee", 00:16:18.152 "strip_size_kb": 0, 00:16:18.152 "state": "online", 00:16:18.152 "raid_level": "raid1", 00:16:18.152 "superblock": false, 00:16:18.152 "num_base_bdevs": 3, 00:16:18.152 "num_base_bdevs_discovered": 3, 00:16:18.152 "num_base_bdevs_operational": 3, 00:16:18.152 "base_bdevs_list": [ 00:16:18.152 { 00:16:18.152 "name": "NewBaseBdev", 00:16:18.152 "uuid": "833c7e9e-1b11-45d6-86f7-d00df344d337", 00:16:18.153 "is_configured": true, 00:16:18.153 "data_offset": 0, 00:16:18.153 "data_size": 65536 00:16:18.153 }, 00:16:18.153 { 00:16:18.153 "name": "BaseBdev2", 00:16:18.153 "uuid": "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06", 00:16:18.153 "is_configured": true, 00:16:18.153 "data_offset": 0, 00:16:18.153 "data_size": 65536 00:16:18.153 }, 00:16:18.153 { 00:16:18.153 "name": "BaseBdev3", 00:16:18.153 "uuid": "1edd294a-c949-47e8-8c64-a86ca019617e", 00:16:18.153 "is_configured": true, 00:16:18.153 "data_offset": 0, 00:16:18.153 "data_size": 65536 00:16:18.153 } 00:16:18.153 ] 00:16:18.153 }' 00:16:18.153 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.153 13:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:18.718 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:18.718 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:18.718 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:18.718 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:18.718 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:18.718 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:18.718 13:42:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:18.718 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:18.976 [2024-07-12 13:42:07.334420] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:18.976 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:18.976 "name": "Existed_Raid", 00:16:18.976 "aliases": [ 00:16:18.976 "5cd40780-c39f-48ea-b9b7-5dff4bbc0dee" 00:16:18.976 ], 00:16:18.976 "product_name": "Raid Volume", 00:16:18.976 "block_size": 512, 00:16:18.976 "num_blocks": 65536, 00:16:18.976 "uuid": "5cd40780-c39f-48ea-b9b7-5dff4bbc0dee", 00:16:18.976 "assigned_rate_limits": { 00:16:18.976 "rw_ios_per_sec": 0, 00:16:18.976 "rw_mbytes_per_sec": 0, 00:16:18.976 "r_mbytes_per_sec": 0, 00:16:18.976 "w_mbytes_per_sec": 0 00:16:18.976 }, 00:16:18.976 "claimed": false, 00:16:18.976 "zoned": false, 00:16:18.976 "supported_io_types": { 00:16:18.976 "read": true, 00:16:18.976 "write": true, 00:16:18.976 "unmap": false, 00:16:18.976 "flush": false, 00:16:18.976 "reset": true, 00:16:18.976 "nvme_admin": false, 00:16:18.976 "nvme_io": false, 00:16:18.976 "nvme_io_md": false, 00:16:18.976 "write_zeroes": true, 00:16:18.976 "zcopy": false, 00:16:18.976 "get_zone_info": false, 00:16:18.976 "zone_management": false, 00:16:18.976 "zone_append": false, 00:16:18.976 "compare": false, 00:16:18.976 "compare_and_write": false, 00:16:18.976 "abort": false, 00:16:18.976 "seek_hole": false, 00:16:18.976 "seek_data": false, 00:16:18.976 "copy": false, 00:16:18.976 "nvme_iov_md": false 00:16:18.976 }, 00:16:18.976 "memory_domains": [ 00:16:18.976 { 00:16:18.976 "dma_device_id": "system", 00:16:18.976 "dma_device_type": 1 00:16:18.976 }, 00:16:18.976 { 00:16:18.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.976 "dma_device_type": 2 00:16:18.976 }, 00:16:18.976 { 00:16:18.976 "dma_device_id": "system", 00:16:18.976 "dma_device_type": 1 00:16:18.976 }, 00:16:18.976 { 00:16:18.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.976 "dma_device_type": 2 00:16:18.976 }, 00:16:18.976 { 00:16:18.976 "dma_device_id": "system", 00:16:18.976 "dma_device_type": 1 00:16:18.976 }, 00:16:18.976 { 00:16:18.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.976 "dma_device_type": 2 00:16:18.976 } 00:16:18.976 ], 00:16:18.976 "driver_specific": { 00:16:18.976 "raid": { 00:16:18.976 "uuid": "5cd40780-c39f-48ea-b9b7-5dff4bbc0dee", 00:16:18.976 "strip_size_kb": 0, 00:16:18.976 "state": "online", 00:16:18.976 "raid_level": "raid1", 00:16:18.976 "superblock": false, 00:16:18.976 "num_base_bdevs": 3, 00:16:18.976 "num_base_bdevs_discovered": 3, 00:16:18.976 "num_base_bdevs_operational": 3, 00:16:18.976 "base_bdevs_list": [ 00:16:18.976 { 00:16:18.976 "name": "NewBaseBdev", 00:16:18.976 "uuid": "833c7e9e-1b11-45d6-86f7-d00df344d337", 00:16:18.976 "is_configured": true, 00:16:18.976 "data_offset": 0, 00:16:18.976 "data_size": 65536 00:16:18.976 }, 00:16:18.976 { 00:16:18.976 "name": "BaseBdev2", 00:16:18.976 "uuid": "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06", 00:16:18.976 "is_configured": true, 00:16:18.976 "data_offset": 0, 00:16:18.976 "data_size": 65536 00:16:18.976 }, 00:16:18.976 { 00:16:18.976 "name": "BaseBdev3", 00:16:18.976 "uuid": "1edd294a-c949-47e8-8c64-a86ca019617e", 00:16:18.976 "is_configured": true, 00:16:18.976 "data_offset": 0, 00:16:18.976 "data_size": 
65536 00:16:18.976 } 00:16:18.976 ] 00:16:18.976 } 00:16:18.976 } 00:16:18.976 }' 00:16:18.976 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:18.976 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:18.976 BaseBdev2 00:16:18.976 BaseBdev3' 00:16:18.976 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:18.976 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:18.976 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:19.234 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:19.234 "name": "NewBaseBdev", 00:16:19.234 "aliases": [ 00:16:19.234 "833c7e9e-1b11-45d6-86f7-d00df344d337" 00:16:19.234 ], 00:16:19.234 "product_name": "Malloc disk", 00:16:19.234 "block_size": 512, 00:16:19.234 "num_blocks": 65536, 00:16:19.234 "uuid": "833c7e9e-1b11-45d6-86f7-d00df344d337", 00:16:19.234 "assigned_rate_limits": { 00:16:19.234 "rw_ios_per_sec": 0, 00:16:19.234 "rw_mbytes_per_sec": 0, 00:16:19.234 "r_mbytes_per_sec": 0, 00:16:19.234 "w_mbytes_per_sec": 0 00:16:19.234 }, 00:16:19.234 "claimed": true, 00:16:19.234 "claim_type": "exclusive_write", 00:16:19.234 "zoned": false, 00:16:19.234 "supported_io_types": { 00:16:19.234 "read": true, 00:16:19.234 "write": true, 00:16:19.234 "unmap": true, 00:16:19.234 "flush": true, 00:16:19.234 "reset": true, 00:16:19.234 "nvme_admin": false, 00:16:19.234 "nvme_io": false, 00:16:19.234 "nvme_io_md": false, 00:16:19.234 "write_zeroes": true, 00:16:19.234 "zcopy": true, 00:16:19.234 "get_zone_info": false, 00:16:19.234 "zone_management": false, 00:16:19.234 "zone_append": false, 00:16:19.234 "compare": false, 00:16:19.234 "compare_and_write": false, 00:16:19.234 "abort": true, 00:16:19.234 "seek_hole": false, 00:16:19.234 "seek_data": false, 00:16:19.234 "copy": true, 00:16:19.234 "nvme_iov_md": false 00:16:19.234 }, 00:16:19.234 "memory_domains": [ 00:16:19.234 { 00:16:19.234 "dma_device_id": "system", 00:16:19.234 "dma_device_type": 1 00:16:19.234 }, 00:16:19.234 { 00:16:19.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.234 "dma_device_type": 2 00:16:19.234 } 00:16:19.234 ], 00:16:19.234 "driver_specific": {} 00:16:19.234 }' 00:16:19.234 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:19.234 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:19.234 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:19.234 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:19.234 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:19.535 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:19.535 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:19.535 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:19.535 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:19.535 13:42:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:19.535 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:19.535 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:19.535 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:19.535 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:19.535 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:19.793 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:19.793 "name": "BaseBdev2", 00:16:19.793 "aliases": [ 00:16:19.793 "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06" 00:16:19.793 ], 00:16:19.793 "product_name": "Malloc disk", 00:16:19.793 "block_size": 512, 00:16:19.793 "num_blocks": 65536, 00:16:19.793 "uuid": "9ec86680-237b-4ed5-b8ae-bf48a4e6cc06", 00:16:19.793 "assigned_rate_limits": { 00:16:19.793 "rw_ios_per_sec": 0, 00:16:19.793 "rw_mbytes_per_sec": 0, 00:16:19.793 "r_mbytes_per_sec": 0, 00:16:19.793 "w_mbytes_per_sec": 0 00:16:19.793 }, 00:16:19.793 "claimed": true, 00:16:19.793 "claim_type": "exclusive_write", 00:16:19.793 "zoned": false, 00:16:19.793 "supported_io_types": { 00:16:19.793 "read": true, 00:16:19.793 "write": true, 00:16:19.793 "unmap": true, 00:16:19.793 "flush": true, 00:16:19.793 "reset": true, 00:16:19.793 "nvme_admin": false, 00:16:19.793 "nvme_io": false, 00:16:19.793 "nvme_io_md": false, 00:16:19.793 "write_zeroes": true, 00:16:19.793 "zcopy": true, 00:16:19.793 "get_zone_info": false, 00:16:19.793 "zone_management": false, 00:16:19.793 "zone_append": false, 00:16:19.793 "compare": false, 00:16:19.793 "compare_and_write": false, 00:16:19.793 "abort": true, 00:16:19.793 "seek_hole": false, 00:16:19.793 "seek_data": false, 00:16:19.793 "copy": true, 00:16:19.793 "nvme_iov_md": false 00:16:19.793 }, 00:16:19.793 "memory_domains": [ 00:16:19.793 { 00:16:19.793 "dma_device_id": "system", 00:16:19.793 "dma_device_type": 1 00:16:19.793 }, 00:16:19.793 { 00:16:19.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.793 "dma_device_type": 2 00:16:19.793 } 00:16:19.793 ], 00:16:19.793 "driver_specific": {} 00:16:19.793 }' 00:16:19.793 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:19.793 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:19.793 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:19.793 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:20.051 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:20.051 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:20.051 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:20.051 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:20.051 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:20.051 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:20.051 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:20.051 13:42:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:20.051 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:20.051 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:20.051 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:20.308 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:20.308 "name": "BaseBdev3", 00:16:20.308 "aliases": [ 00:16:20.308 "1edd294a-c949-47e8-8c64-a86ca019617e" 00:16:20.308 ], 00:16:20.308 "product_name": "Malloc disk", 00:16:20.308 "block_size": 512, 00:16:20.308 "num_blocks": 65536, 00:16:20.308 "uuid": "1edd294a-c949-47e8-8c64-a86ca019617e", 00:16:20.308 "assigned_rate_limits": { 00:16:20.308 "rw_ios_per_sec": 0, 00:16:20.308 "rw_mbytes_per_sec": 0, 00:16:20.308 "r_mbytes_per_sec": 0, 00:16:20.308 "w_mbytes_per_sec": 0 00:16:20.308 }, 00:16:20.308 "claimed": true, 00:16:20.308 "claim_type": "exclusive_write", 00:16:20.308 "zoned": false, 00:16:20.308 "supported_io_types": { 00:16:20.308 "read": true, 00:16:20.308 "write": true, 00:16:20.308 "unmap": true, 00:16:20.308 "flush": true, 00:16:20.308 "reset": true, 00:16:20.308 "nvme_admin": false, 00:16:20.308 "nvme_io": false, 00:16:20.308 "nvme_io_md": false, 00:16:20.309 "write_zeroes": true, 00:16:20.309 "zcopy": true, 00:16:20.309 "get_zone_info": false, 00:16:20.309 "zone_management": false, 00:16:20.309 "zone_append": false, 00:16:20.309 "compare": false, 00:16:20.309 "compare_and_write": false, 00:16:20.309 "abort": true, 00:16:20.309 "seek_hole": false, 00:16:20.309 "seek_data": false, 00:16:20.309 "copy": true, 00:16:20.309 "nvme_iov_md": false 00:16:20.309 }, 00:16:20.309 "memory_domains": [ 00:16:20.309 { 00:16:20.309 "dma_device_id": "system", 00:16:20.309 "dma_device_type": 1 00:16:20.309 }, 00:16:20.309 { 00:16:20.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.309 "dma_device_type": 2 00:16:20.309 } 00:16:20.309 ], 00:16:20.309 "driver_specific": {} 00:16:20.309 }' 00:16:20.309 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:20.566 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:20.566 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:20.566 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:20.566 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:20.566 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:20.566 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:20.566 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:20.566 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:20.566 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:20.823 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:20.823 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:20.823 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:21.081 [2024-07-12 13:42:09.427672] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:21.081 [2024-07-12 13:42:09.427704] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:21.081 [2024-07-12 13:42:09.427756] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:21.081 [2024-07-12 13:42:09.428025] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:21.081 [2024-07-12 13:42:09.428039] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17092b0 name Existed_Raid, state offline 00:16:21.081 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 474606 00:16:21.082 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 474606 ']' 00:16:21.082 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 474606 00:16:21.082 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:21.082 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:21.082 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 474606 00:16:21.082 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:21.082 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:21.082 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 474606' 00:16:21.082 killing process with pid 474606 00:16:21.082 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 474606 00:16:21.082 [2024-07-12 13:42:09.501536] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:21.082 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 474606 00:16:21.082 [2024-07-12 13:42:09.528823] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:21.341 00:16:21.341 real 0m29.288s 00:16:21.341 user 0m53.644s 00:16:21.341 sys 0m5.280s 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.341 ************************************ 00:16:21.341 END TEST raid_state_function_test 00:16:21.341 ************************************ 00:16:21.341 13:42:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:21.341 13:42:09 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:16:21.341 13:42:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:21.341 13:42:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:21.341 13:42:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:21.341 ************************************ 00:16:21.341 START TEST raid_state_function_test_sb 00:16:21.341 ************************************ 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:21.341 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:21.342 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:21.342 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=479409 00:16:21.342 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 479409' 00:16:21.342 Process raid pid: 479409 00:16:21.342 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:21.342 13:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 479409 /var/tmp/spdk-raid.sock 00:16:21.342 13:42:09 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 479409 ']' 00:16:21.342 13:42:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:21.342 13:42:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:21.342 13:42:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:21.342 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:21.342 13:42:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:21.342 13:42:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:21.342 [2024-07-12 13:42:09.917121] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:16:21.342 [2024-07-12 13:42:09.917199] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:21.600 [2024-07-12 13:42:10.051221] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:21.600 [2024-07-12 13:42:10.151528] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:21.859 [2024-07-12 13:42:10.219414] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:21.859 [2024-07-12 13:42:10.219451] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:22.425 13:42:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:22.425 13:42:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:22.425 13:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:22.684 [2024-07-12 13:42:11.018513] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:22.684 [2024-07-12 13:42:11.018558] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:22.684 [2024-07-12 13:42:11.018574] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:22.684 [2024-07-12 13:42:11.018587] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:22.684 [2024-07-12 13:42:11.018596] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:22.684 [2024-07-12 13:42:11.018607] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.684 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:22.944 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:22.944 "name": "Existed_Raid", 00:16:22.944 "uuid": "a603a660-786c-4737-8237-afec42b8eb71", 00:16:22.944 "strip_size_kb": 0, 00:16:22.944 "state": "configuring", 00:16:22.944 "raid_level": "raid1", 00:16:22.944 "superblock": true, 00:16:22.944 "num_base_bdevs": 3, 00:16:22.944 "num_base_bdevs_discovered": 0, 00:16:22.944 "num_base_bdevs_operational": 3, 00:16:22.944 "base_bdevs_list": [ 00:16:22.944 { 00:16:22.944 "name": "BaseBdev1", 00:16:22.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:22.944 "is_configured": false, 00:16:22.944 "data_offset": 0, 00:16:22.944 "data_size": 0 00:16:22.944 }, 00:16:22.944 { 00:16:22.944 "name": "BaseBdev2", 00:16:22.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:22.944 "is_configured": false, 00:16:22.944 "data_offset": 0, 00:16:22.944 "data_size": 0 00:16:22.944 }, 00:16:22.944 { 00:16:22.944 "name": "BaseBdev3", 00:16:22.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:22.944 "is_configured": false, 00:16:22.944 "data_offset": 0, 00:16:22.944 "data_size": 0 00:16:22.944 } 00:16:22.944 ] 00:16:22.944 }' 00:16:22.944 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:22.944 13:42:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:23.511 13:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:23.511 [2024-07-12 13:42:12.077183] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:23.512 [2024-07-12 13:42:12.077215] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb18350 name Existed_Raid, state configuring 00:16:23.770 13:42:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:23.770 [2024-07-12 13:42:12.325847] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:23.770 [2024-07-12 13:42:12.325875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:23.770 [2024-07-12 13:42:12.325885] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:23.770 [2024-07-12 13:42:12.325896] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:16:23.770 [2024-07-12 13:42:12.325910] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:23.770 [2024-07-12 13:42:12.325921] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:23.770 13:42:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:24.030 [2024-07-12 13:42:12.580420] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:24.030 BaseBdev1 00:16:24.030 13:42:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:24.030 13:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:24.030 13:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:24.030 13:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:24.030 13:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:24.030 13:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:24.030 13:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:24.289 13:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:24.548 [ 00:16:24.548 { 00:16:24.548 "name": "BaseBdev1", 00:16:24.548 "aliases": [ 00:16:24.548 "06282fc4-9ab1-43ab-b976-d0bc74ff4637" 00:16:24.548 ], 00:16:24.548 "product_name": "Malloc disk", 00:16:24.548 "block_size": 512, 00:16:24.548 "num_blocks": 65536, 00:16:24.548 "uuid": "06282fc4-9ab1-43ab-b976-d0bc74ff4637", 00:16:24.548 "assigned_rate_limits": { 00:16:24.548 "rw_ios_per_sec": 0, 00:16:24.548 "rw_mbytes_per_sec": 0, 00:16:24.548 "r_mbytes_per_sec": 0, 00:16:24.548 "w_mbytes_per_sec": 0 00:16:24.548 }, 00:16:24.548 "claimed": true, 00:16:24.548 "claim_type": "exclusive_write", 00:16:24.548 "zoned": false, 00:16:24.548 "supported_io_types": { 00:16:24.548 "read": true, 00:16:24.548 "write": true, 00:16:24.548 "unmap": true, 00:16:24.548 "flush": true, 00:16:24.548 "reset": true, 00:16:24.548 "nvme_admin": false, 00:16:24.548 "nvme_io": false, 00:16:24.548 "nvme_io_md": false, 00:16:24.548 "write_zeroes": true, 00:16:24.548 "zcopy": true, 00:16:24.548 "get_zone_info": false, 00:16:24.548 "zone_management": false, 00:16:24.548 "zone_append": false, 00:16:24.548 "compare": false, 00:16:24.548 "compare_and_write": false, 00:16:24.548 "abort": true, 00:16:24.548 "seek_hole": false, 00:16:24.548 "seek_data": false, 00:16:24.548 "copy": true, 00:16:24.548 "nvme_iov_md": false 00:16:24.548 }, 00:16:24.548 "memory_domains": [ 00:16:24.548 { 00:16:24.548 "dma_device_id": "system", 00:16:24.548 "dma_device_type": 1 00:16:24.548 }, 00:16:24.548 { 00:16:24.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.548 "dma_device_type": 2 00:16:24.548 } 00:16:24.548 ], 00:16:24.548 "driver_specific": {} 00:16:24.548 } 00:16:24.548 ] 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 
00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.548 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:24.806 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:24.806 "name": "Existed_Raid", 00:16:24.806 "uuid": "d0d9fbd1-e891-4692-b0eb-668fb582920a", 00:16:24.806 "strip_size_kb": 0, 00:16:24.806 "state": "configuring", 00:16:24.806 "raid_level": "raid1", 00:16:24.806 "superblock": true, 00:16:24.806 "num_base_bdevs": 3, 00:16:24.806 "num_base_bdevs_discovered": 1, 00:16:24.806 "num_base_bdevs_operational": 3, 00:16:24.806 "base_bdevs_list": [ 00:16:24.806 { 00:16:24.806 "name": "BaseBdev1", 00:16:24.806 "uuid": "06282fc4-9ab1-43ab-b976-d0bc74ff4637", 00:16:24.806 "is_configured": true, 00:16:24.806 "data_offset": 2048, 00:16:24.806 "data_size": 63488 00:16:24.806 }, 00:16:24.806 { 00:16:24.806 "name": "BaseBdev2", 00:16:24.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:24.806 "is_configured": false, 00:16:24.806 "data_offset": 0, 00:16:24.806 "data_size": 0 00:16:24.806 }, 00:16:24.806 { 00:16:24.806 "name": "BaseBdev3", 00:16:24.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:24.806 "is_configured": false, 00:16:24.806 "data_offset": 0, 00:16:24.806 "data_size": 0 00:16:24.806 } 00:16:24.806 ] 00:16:24.806 }' 00:16:24.806 13:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:24.806 13:42:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:25.741 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:25.741 [2024-07-12 13:42:14.240828] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:25.741 [2024-07-12 13:42:14.240873] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb17c20 name Existed_Raid, state configuring 00:16:25.741 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:26.015 [2024-07-12 13:42:14.481502] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:26.015 [2024-07-12 13:42:14.483042] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:26.015 [2024-07-12 13:42:14.483077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:26.015 [2024-07-12 13:42:14.483087] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:26.015 [2024-07-12 13:42:14.483098] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.015 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:26.275 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.275 "name": "Existed_Raid", 00:16:26.275 "uuid": "7e70497b-2c3b-463c-96db-a855d2e38349", 00:16:26.275 "strip_size_kb": 0, 00:16:26.275 "state": "configuring", 00:16:26.275 "raid_level": "raid1", 00:16:26.275 "superblock": true, 00:16:26.275 "num_base_bdevs": 3, 00:16:26.275 "num_base_bdevs_discovered": 1, 00:16:26.275 "num_base_bdevs_operational": 3, 00:16:26.275 "base_bdevs_list": [ 00:16:26.275 { 00:16:26.275 "name": "BaseBdev1", 00:16:26.275 "uuid": "06282fc4-9ab1-43ab-b976-d0bc74ff4637", 00:16:26.275 "is_configured": true, 00:16:26.275 "data_offset": 2048, 00:16:26.275 "data_size": 63488 00:16:26.275 }, 00:16:26.275 { 00:16:26.275 "name": "BaseBdev2", 00:16:26.275 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:26.275 "is_configured": false, 00:16:26.275 "data_offset": 0, 00:16:26.275 "data_size": 0 00:16:26.275 }, 00:16:26.275 { 00:16:26.275 "name": 
"BaseBdev3", 00:16:26.275 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:26.275 "is_configured": false, 00:16:26.275 "data_offset": 0, 00:16:26.275 "data_size": 0 00:16:26.275 } 00:16:26.275 ] 00:16:26.275 }' 00:16:26.275 13:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.275 13:42:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:26.843 13:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:27.102 [2024-07-12 13:42:15.627977] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:27.102 BaseBdev2 00:16:27.102 13:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:27.103 13:42:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:27.103 13:42:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:27.103 13:42:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:27.103 13:42:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:27.103 13:42:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:27.103 13:42:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:27.362 13:42:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:27.621 [ 00:16:27.621 { 00:16:27.621 "name": "BaseBdev2", 00:16:27.621 "aliases": [ 00:16:27.621 "3f08b71f-0e02-4e3a-aa07-da314cb2b259" 00:16:27.621 ], 00:16:27.621 "product_name": "Malloc disk", 00:16:27.621 "block_size": 512, 00:16:27.621 "num_blocks": 65536, 00:16:27.621 "uuid": "3f08b71f-0e02-4e3a-aa07-da314cb2b259", 00:16:27.621 "assigned_rate_limits": { 00:16:27.621 "rw_ios_per_sec": 0, 00:16:27.621 "rw_mbytes_per_sec": 0, 00:16:27.621 "r_mbytes_per_sec": 0, 00:16:27.621 "w_mbytes_per_sec": 0 00:16:27.621 }, 00:16:27.621 "claimed": true, 00:16:27.621 "claim_type": "exclusive_write", 00:16:27.621 "zoned": false, 00:16:27.621 "supported_io_types": { 00:16:27.621 "read": true, 00:16:27.621 "write": true, 00:16:27.621 "unmap": true, 00:16:27.621 "flush": true, 00:16:27.621 "reset": true, 00:16:27.621 "nvme_admin": false, 00:16:27.621 "nvme_io": false, 00:16:27.621 "nvme_io_md": false, 00:16:27.621 "write_zeroes": true, 00:16:27.621 "zcopy": true, 00:16:27.621 "get_zone_info": false, 00:16:27.621 "zone_management": false, 00:16:27.621 "zone_append": false, 00:16:27.621 "compare": false, 00:16:27.621 "compare_and_write": false, 00:16:27.621 "abort": true, 00:16:27.621 "seek_hole": false, 00:16:27.621 "seek_data": false, 00:16:27.621 "copy": true, 00:16:27.621 "nvme_iov_md": false 00:16:27.621 }, 00:16:27.621 "memory_domains": [ 00:16:27.621 { 00:16:27.621 "dma_device_id": "system", 00:16:27.621 "dma_device_type": 1 00:16:27.621 }, 00:16:27.621 { 00:16:27.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.621 "dma_device_type": 2 00:16:27.621 } 00:16:27.621 ], 00:16:27.621 "driver_specific": {} 
00:16:27.621 } 00:16:27.622 ] 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.622 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:27.881 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.881 "name": "Existed_Raid", 00:16:27.881 "uuid": "7e70497b-2c3b-463c-96db-a855d2e38349", 00:16:27.881 "strip_size_kb": 0, 00:16:27.881 "state": "configuring", 00:16:27.881 "raid_level": "raid1", 00:16:27.881 "superblock": true, 00:16:27.881 "num_base_bdevs": 3, 00:16:27.881 "num_base_bdevs_discovered": 2, 00:16:27.881 "num_base_bdevs_operational": 3, 00:16:27.881 "base_bdevs_list": [ 00:16:27.881 { 00:16:27.881 "name": "BaseBdev1", 00:16:27.881 "uuid": "06282fc4-9ab1-43ab-b976-d0bc74ff4637", 00:16:27.881 "is_configured": true, 00:16:27.881 "data_offset": 2048, 00:16:27.881 "data_size": 63488 00:16:27.881 }, 00:16:27.881 { 00:16:27.881 "name": "BaseBdev2", 00:16:27.881 "uuid": "3f08b71f-0e02-4e3a-aa07-da314cb2b259", 00:16:27.881 "is_configured": true, 00:16:27.881 "data_offset": 2048, 00:16:27.881 "data_size": 63488 00:16:27.881 }, 00:16:27.881 { 00:16:27.881 "name": "BaseBdev3", 00:16:27.881 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.881 "is_configured": false, 00:16:27.881 "data_offset": 0, 00:16:27.881 "data_size": 0 00:16:27.881 } 00:16:27.881 ] 00:16:27.881 }' 00:16:27.881 13:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.881 13:42:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:28.448 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:28.707 [2024-07-12 
13:42:17.187502] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:28.707 [2024-07-12 13:42:17.187658] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb18b10 00:16:28.707 [2024-07-12 13:42:17.187672] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:28.707 [2024-07-12 13:42:17.187850] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb187e0 00:16:28.707 [2024-07-12 13:42:17.187991] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb18b10 00:16:28.707 [2024-07-12 13:42:17.188002] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb18b10 00:16:28.707 [2024-07-12 13:42:17.188099] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:28.707 BaseBdev3 00:16:28.707 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:28.707 13:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:28.707 13:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:28.707 13:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:28.707 13:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:28.707 13:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:28.707 13:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:28.966 13:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:29.224 [ 00:16:29.224 { 00:16:29.224 "name": "BaseBdev3", 00:16:29.224 "aliases": [ 00:16:29.224 "f7591c63-2278-46b6-ae18-52a4796ffb44" 00:16:29.224 ], 00:16:29.224 "product_name": "Malloc disk", 00:16:29.224 "block_size": 512, 00:16:29.224 "num_blocks": 65536, 00:16:29.224 "uuid": "f7591c63-2278-46b6-ae18-52a4796ffb44", 00:16:29.224 "assigned_rate_limits": { 00:16:29.224 "rw_ios_per_sec": 0, 00:16:29.224 "rw_mbytes_per_sec": 0, 00:16:29.224 "r_mbytes_per_sec": 0, 00:16:29.224 "w_mbytes_per_sec": 0 00:16:29.224 }, 00:16:29.224 "claimed": true, 00:16:29.224 "claim_type": "exclusive_write", 00:16:29.224 "zoned": false, 00:16:29.224 "supported_io_types": { 00:16:29.224 "read": true, 00:16:29.224 "write": true, 00:16:29.224 "unmap": true, 00:16:29.224 "flush": true, 00:16:29.224 "reset": true, 00:16:29.224 "nvme_admin": false, 00:16:29.224 "nvme_io": false, 00:16:29.224 "nvme_io_md": false, 00:16:29.224 "write_zeroes": true, 00:16:29.224 "zcopy": true, 00:16:29.224 "get_zone_info": false, 00:16:29.224 "zone_management": false, 00:16:29.224 "zone_append": false, 00:16:29.224 "compare": false, 00:16:29.224 "compare_and_write": false, 00:16:29.224 "abort": true, 00:16:29.224 "seek_hole": false, 00:16:29.224 "seek_data": false, 00:16:29.224 "copy": true, 00:16:29.224 "nvme_iov_md": false 00:16:29.224 }, 00:16:29.224 "memory_domains": [ 00:16:29.224 { 00:16:29.224 "dma_device_id": "system", 00:16:29.224 "dma_device_type": 1 00:16:29.224 }, 00:16:29.224 { 00:16:29.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.224 
"dma_device_type": 2 00:16:29.224 } 00:16:29.224 ], 00:16:29.224 "driver_specific": {} 00:16:29.224 } 00:16:29.224 ] 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:29.224 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.483 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.483 "name": "Existed_Raid", 00:16:29.483 "uuid": "7e70497b-2c3b-463c-96db-a855d2e38349", 00:16:29.483 "strip_size_kb": 0, 00:16:29.483 "state": "online", 00:16:29.483 "raid_level": "raid1", 00:16:29.483 "superblock": true, 00:16:29.483 "num_base_bdevs": 3, 00:16:29.483 "num_base_bdevs_discovered": 3, 00:16:29.483 "num_base_bdevs_operational": 3, 00:16:29.483 "base_bdevs_list": [ 00:16:29.483 { 00:16:29.483 "name": "BaseBdev1", 00:16:29.483 "uuid": "06282fc4-9ab1-43ab-b976-d0bc74ff4637", 00:16:29.483 "is_configured": true, 00:16:29.483 "data_offset": 2048, 00:16:29.483 "data_size": 63488 00:16:29.483 }, 00:16:29.483 { 00:16:29.483 "name": "BaseBdev2", 00:16:29.483 "uuid": "3f08b71f-0e02-4e3a-aa07-da314cb2b259", 00:16:29.483 "is_configured": true, 00:16:29.483 "data_offset": 2048, 00:16:29.483 "data_size": 63488 00:16:29.483 }, 00:16:29.483 { 00:16:29.483 "name": "BaseBdev3", 00:16:29.483 "uuid": "f7591c63-2278-46b6-ae18-52a4796ffb44", 00:16:29.483 "is_configured": true, 00:16:29.483 "data_offset": 2048, 00:16:29.483 "data_size": 63488 00:16:29.483 } 00:16:29.483 ] 00:16:29.483 }' 00:16:29.483 13:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.483 13:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:30.050 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:30.050 13:42:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:30.050 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:30.050 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:30.050 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:30.050 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:30.050 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:30.050 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:30.309 [2024-07-12 13:42:18.772011] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:30.309 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:30.309 "name": "Existed_Raid", 00:16:30.309 "aliases": [ 00:16:30.309 "7e70497b-2c3b-463c-96db-a855d2e38349" 00:16:30.309 ], 00:16:30.309 "product_name": "Raid Volume", 00:16:30.309 "block_size": 512, 00:16:30.309 "num_blocks": 63488, 00:16:30.309 "uuid": "7e70497b-2c3b-463c-96db-a855d2e38349", 00:16:30.309 "assigned_rate_limits": { 00:16:30.309 "rw_ios_per_sec": 0, 00:16:30.309 "rw_mbytes_per_sec": 0, 00:16:30.309 "r_mbytes_per_sec": 0, 00:16:30.309 "w_mbytes_per_sec": 0 00:16:30.309 }, 00:16:30.309 "claimed": false, 00:16:30.309 "zoned": false, 00:16:30.309 "supported_io_types": { 00:16:30.309 "read": true, 00:16:30.309 "write": true, 00:16:30.309 "unmap": false, 00:16:30.309 "flush": false, 00:16:30.309 "reset": true, 00:16:30.309 "nvme_admin": false, 00:16:30.309 "nvme_io": false, 00:16:30.309 "nvme_io_md": false, 00:16:30.309 "write_zeroes": true, 00:16:30.309 "zcopy": false, 00:16:30.309 "get_zone_info": false, 00:16:30.309 "zone_management": false, 00:16:30.309 "zone_append": false, 00:16:30.309 "compare": false, 00:16:30.309 "compare_and_write": false, 00:16:30.309 "abort": false, 00:16:30.309 "seek_hole": false, 00:16:30.309 "seek_data": false, 00:16:30.309 "copy": false, 00:16:30.309 "nvme_iov_md": false 00:16:30.309 }, 00:16:30.309 "memory_domains": [ 00:16:30.309 { 00:16:30.309 "dma_device_id": "system", 00:16:30.309 "dma_device_type": 1 00:16:30.309 }, 00:16:30.309 { 00:16:30.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.309 "dma_device_type": 2 00:16:30.309 }, 00:16:30.309 { 00:16:30.309 "dma_device_id": "system", 00:16:30.309 "dma_device_type": 1 00:16:30.309 }, 00:16:30.309 { 00:16:30.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.309 "dma_device_type": 2 00:16:30.309 }, 00:16:30.309 { 00:16:30.309 "dma_device_id": "system", 00:16:30.309 "dma_device_type": 1 00:16:30.309 }, 00:16:30.309 { 00:16:30.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.309 "dma_device_type": 2 00:16:30.309 } 00:16:30.309 ], 00:16:30.309 "driver_specific": { 00:16:30.309 "raid": { 00:16:30.309 "uuid": "7e70497b-2c3b-463c-96db-a855d2e38349", 00:16:30.309 "strip_size_kb": 0, 00:16:30.309 "state": "online", 00:16:30.309 "raid_level": "raid1", 00:16:30.309 "superblock": true, 00:16:30.309 "num_base_bdevs": 3, 00:16:30.309 "num_base_bdevs_discovered": 3, 00:16:30.309 "num_base_bdevs_operational": 3, 00:16:30.309 "base_bdevs_list": [ 00:16:30.309 { 00:16:30.309 "name": "BaseBdev1", 00:16:30.309 "uuid": 
"06282fc4-9ab1-43ab-b976-d0bc74ff4637", 00:16:30.309 "is_configured": true, 00:16:30.309 "data_offset": 2048, 00:16:30.309 "data_size": 63488 00:16:30.309 }, 00:16:30.309 { 00:16:30.309 "name": "BaseBdev2", 00:16:30.309 "uuid": "3f08b71f-0e02-4e3a-aa07-da314cb2b259", 00:16:30.309 "is_configured": true, 00:16:30.309 "data_offset": 2048, 00:16:30.309 "data_size": 63488 00:16:30.309 }, 00:16:30.309 { 00:16:30.310 "name": "BaseBdev3", 00:16:30.310 "uuid": "f7591c63-2278-46b6-ae18-52a4796ffb44", 00:16:30.310 "is_configured": true, 00:16:30.310 "data_offset": 2048, 00:16:30.310 "data_size": 63488 00:16:30.310 } 00:16:30.310 ] 00:16:30.310 } 00:16:30.310 } 00:16:30.310 }' 00:16:30.310 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:30.310 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:30.310 BaseBdev2 00:16:30.310 BaseBdev3' 00:16:30.310 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:30.310 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:30.310 13:42:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:30.568 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:30.568 "name": "BaseBdev1", 00:16:30.568 "aliases": [ 00:16:30.568 "06282fc4-9ab1-43ab-b976-d0bc74ff4637" 00:16:30.568 ], 00:16:30.568 "product_name": "Malloc disk", 00:16:30.568 "block_size": 512, 00:16:30.568 "num_blocks": 65536, 00:16:30.568 "uuid": "06282fc4-9ab1-43ab-b976-d0bc74ff4637", 00:16:30.568 "assigned_rate_limits": { 00:16:30.568 "rw_ios_per_sec": 0, 00:16:30.568 "rw_mbytes_per_sec": 0, 00:16:30.568 "r_mbytes_per_sec": 0, 00:16:30.568 "w_mbytes_per_sec": 0 00:16:30.568 }, 00:16:30.568 "claimed": true, 00:16:30.568 "claim_type": "exclusive_write", 00:16:30.568 "zoned": false, 00:16:30.568 "supported_io_types": { 00:16:30.568 "read": true, 00:16:30.568 "write": true, 00:16:30.568 "unmap": true, 00:16:30.568 "flush": true, 00:16:30.568 "reset": true, 00:16:30.568 "nvme_admin": false, 00:16:30.568 "nvme_io": false, 00:16:30.568 "nvme_io_md": false, 00:16:30.568 "write_zeroes": true, 00:16:30.568 "zcopy": true, 00:16:30.568 "get_zone_info": false, 00:16:30.568 "zone_management": false, 00:16:30.568 "zone_append": false, 00:16:30.568 "compare": false, 00:16:30.568 "compare_and_write": false, 00:16:30.568 "abort": true, 00:16:30.568 "seek_hole": false, 00:16:30.568 "seek_data": false, 00:16:30.568 "copy": true, 00:16:30.568 "nvme_iov_md": false 00:16:30.568 }, 00:16:30.568 "memory_domains": [ 00:16:30.568 { 00:16:30.568 "dma_device_id": "system", 00:16:30.568 "dma_device_type": 1 00:16:30.568 }, 00:16:30.569 { 00:16:30.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.569 "dma_device_type": 2 00:16:30.569 } 00:16:30.569 ], 00:16:30.569 "driver_specific": {} 00:16:30.569 }' 00:16:30.569 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:30.569 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:30.827 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:30.827 13:42:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:30.827 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:30.827 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:30.827 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.827 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.827 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:30.827 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.827 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:31.085 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:31.085 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:31.085 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:31.085 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:31.344 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:31.344 "name": "BaseBdev2", 00:16:31.344 "aliases": [ 00:16:31.344 "3f08b71f-0e02-4e3a-aa07-da314cb2b259" 00:16:31.344 ], 00:16:31.344 "product_name": "Malloc disk", 00:16:31.344 "block_size": 512, 00:16:31.344 "num_blocks": 65536, 00:16:31.344 "uuid": "3f08b71f-0e02-4e3a-aa07-da314cb2b259", 00:16:31.344 "assigned_rate_limits": { 00:16:31.344 "rw_ios_per_sec": 0, 00:16:31.344 "rw_mbytes_per_sec": 0, 00:16:31.344 "r_mbytes_per_sec": 0, 00:16:31.344 "w_mbytes_per_sec": 0 00:16:31.344 }, 00:16:31.344 "claimed": true, 00:16:31.344 "claim_type": "exclusive_write", 00:16:31.344 "zoned": false, 00:16:31.344 "supported_io_types": { 00:16:31.344 "read": true, 00:16:31.344 "write": true, 00:16:31.344 "unmap": true, 00:16:31.344 "flush": true, 00:16:31.344 "reset": true, 00:16:31.344 "nvme_admin": false, 00:16:31.344 "nvme_io": false, 00:16:31.344 "nvme_io_md": false, 00:16:31.344 "write_zeroes": true, 00:16:31.344 "zcopy": true, 00:16:31.344 "get_zone_info": false, 00:16:31.344 "zone_management": false, 00:16:31.344 "zone_append": false, 00:16:31.344 "compare": false, 00:16:31.344 "compare_and_write": false, 00:16:31.344 "abort": true, 00:16:31.344 "seek_hole": false, 00:16:31.344 "seek_data": false, 00:16:31.344 "copy": true, 00:16:31.344 "nvme_iov_md": false 00:16:31.344 }, 00:16:31.344 "memory_domains": [ 00:16:31.344 { 00:16:31.344 "dma_device_id": "system", 00:16:31.344 "dma_device_type": 1 00:16:31.344 }, 00:16:31.344 { 00:16:31.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.344 "dma_device_type": 2 00:16:31.344 } 00:16:31.344 ], 00:16:31.344 "driver_specific": {} 00:16:31.344 }' 00:16:31.344 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:31.344 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:31.344 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:31.344 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:31.344 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:16:31.344 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:31.344 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:31.344 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:31.603 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:31.603 13:42:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:31.603 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:31.603 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:31.603 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:31.603 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:31.603 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:31.866 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:31.866 "name": "BaseBdev3", 00:16:31.866 "aliases": [ 00:16:31.866 "f7591c63-2278-46b6-ae18-52a4796ffb44" 00:16:31.866 ], 00:16:31.866 "product_name": "Malloc disk", 00:16:31.866 "block_size": 512, 00:16:31.866 "num_blocks": 65536, 00:16:31.866 "uuid": "f7591c63-2278-46b6-ae18-52a4796ffb44", 00:16:31.866 "assigned_rate_limits": { 00:16:31.866 "rw_ios_per_sec": 0, 00:16:31.866 "rw_mbytes_per_sec": 0, 00:16:31.866 "r_mbytes_per_sec": 0, 00:16:31.866 "w_mbytes_per_sec": 0 00:16:31.866 }, 00:16:31.866 "claimed": true, 00:16:31.866 "claim_type": "exclusive_write", 00:16:31.866 "zoned": false, 00:16:31.866 "supported_io_types": { 00:16:31.866 "read": true, 00:16:31.866 "write": true, 00:16:31.866 "unmap": true, 00:16:31.866 "flush": true, 00:16:31.866 "reset": true, 00:16:31.866 "nvme_admin": false, 00:16:31.866 "nvme_io": false, 00:16:31.866 "nvme_io_md": false, 00:16:31.866 "write_zeroes": true, 00:16:31.866 "zcopy": true, 00:16:31.866 "get_zone_info": false, 00:16:31.866 "zone_management": false, 00:16:31.866 "zone_append": false, 00:16:31.866 "compare": false, 00:16:31.866 "compare_and_write": false, 00:16:31.866 "abort": true, 00:16:31.866 "seek_hole": false, 00:16:31.866 "seek_data": false, 00:16:31.866 "copy": true, 00:16:31.866 "nvme_iov_md": false 00:16:31.866 }, 00:16:31.866 "memory_domains": [ 00:16:31.866 { 00:16:31.866 "dma_device_id": "system", 00:16:31.866 "dma_device_type": 1 00:16:31.866 }, 00:16:31.866 { 00:16:31.866 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.866 "dma_device_type": 2 00:16:31.866 } 00:16:31.866 ], 00:16:31.866 "driver_specific": {} 00:16:31.866 }' 00:16:31.866 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:31.866 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:31.866 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:31.866 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:31.866 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.194 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:32.194 
13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.194 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.194 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:32.194 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.194 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.194 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:32.194 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:32.489 [2024-07-12 13:42:20.889384] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:32.489 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:32.489 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:32.489 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:32.489 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:16:32.489 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:32.489 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:32.489 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:32.489 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:32.489 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:32.490 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:32.490 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:32.490 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:32.490 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:32.490 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:32.490 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:32.490 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.490 13:42:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.809 13:42:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.809 "name": "Existed_Raid", 00:16:32.809 "uuid": "7e70497b-2c3b-463c-96db-a855d2e38349", 00:16:32.809 "strip_size_kb": 0, 00:16:32.809 "state": "online", 00:16:32.809 "raid_level": "raid1", 00:16:32.809 "superblock": true, 00:16:32.809 "num_base_bdevs": 3, 00:16:32.809 "num_base_bdevs_discovered": 2, 00:16:32.809 "num_base_bdevs_operational": 2, 00:16:32.809 "base_bdevs_list": [ 00:16:32.809 { 00:16:32.809 "name": null, 00:16:32.809 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:16:32.809 "is_configured": false, 00:16:32.809 "data_offset": 2048, 00:16:32.809 "data_size": 63488 00:16:32.809 }, 00:16:32.809 { 00:16:32.809 "name": "BaseBdev2", 00:16:32.809 "uuid": "3f08b71f-0e02-4e3a-aa07-da314cb2b259", 00:16:32.809 "is_configured": true, 00:16:32.809 "data_offset": 2048, 00:16:32.809 "data_size": 63488 00:16:32.809 }, 00:16:32.809 { 00:16:32.809 "name": "BaseBdev3", 00:16:32.809 "uuid": "f7591c63-2278-46b6-ae18-52a4796ffb44", 00:16:32.809 "is_configured": true, 00:16:32.809 "data_offset": 2048, 00:16:32.809 "data_size": 63488 00:16:32.809 } 00:16:32.809 ] 00:16:32.809 }' 00:16:32.810 13:42:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.810 13:42:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:33.377 13:42:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:33.377 13:42:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:33.377 13:42:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.377 13:42:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:33.377 13:42:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:33.377 13:42:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:33.377 13:42:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:33.636 [2024-07-12 13:42:22.178034] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:33.636 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:33.636 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:33.636 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.636 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:33.894 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:33.894 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:33.894 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:34.152 [2024-07-12 13:42:22.671883] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:34.152 [2024-07-12 13:42:22.671986] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:34.152 [2024-07-12 13:42:22.684450] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:34.152 [2024-07-12 13:42:22.684487] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:34.152 [2024-07-12 13:42:22.684499] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb18b10 
name Existed_Raid, state offline 00:16:34.152 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:34.152 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:34.152 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.152 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:34.410 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:34.410 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:34.410 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:34.410 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:34.410 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:34.410 13:42:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:34.668 BaseBdev2 00:16:34.668 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:34.668 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:34.668 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:34.668 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:34.668 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:34.668 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:34.668 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:34.926 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:35.184 [ 00:16:35.184 { 00:16:35.184 "name": "BaseBdev2", 00:16:35.184 "aliases": [ 00:16:35.184 "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6" 00:16:35.184 ], 00:16:35.184 "product_name": "Malloc disk", 00:16:35.184 "block_size": 512, 00:16:35.184 "num_blocks": 65536, 00:16:35.184 "uuid": "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6", 00:16:35.184 "assigned_rate_limits": { 00:16:35.184 "rw_ios_per_sec": 0, 00:16:35.184 "rw_mbytes_per_sec": 0, 00:16:35.184 "r_mbytes_per_sec": 0, 00:16:35.184 "w_mbytes_per_sec": 0 00:16:35.184 }, 00:16:35.184 "claimed": false, 00:16:35.184 "zoned": false, 00:16:35.184 "supported_io_types": { 00:16:35.184 "read": true, 00:16:35.184 "write": true, 00:16:35.184 "unmap": true, 00:16:35.184 "flush": true, 00:16:35.184 "reset": true, 00:16:35.184 "nvme_admin": false, 00:16:35.184 "nvme_io": false, 00:16:35.184 "nvme_io_md": false, 00:16:35.184 "write_zeroes": true, 00:16:35.184 "zcopy": true, 00:16:35.184 "get_zone_info": false, 00:16:35.184 "zone_management": false, 00:16:35.184 "zone_append": false, 00:16:35.184 "compare": false, 00:16:35.184 
"compare_and_write": false, 00:16:35.184 "abort": true, 00:16:35.184 "seek_hole": false, 00:16:35.184 "seek_data": false, 00:16:35.184 "copy": true, 00:16:35.184 "nvme_iov_md": false 00:16:35.184 }, 00:16:35.184 "memory_domains": [ 00:16:35.184 { 00:16:35.184 "dma_device_id": "system", 00:16:35.184 "dma_device_type": 1 00:16:35.184 }, 00:16:35.184 { 00:16:35.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.184 "dma_device_type": 2 00:16:35.184 } 00:16:35.184 ], 00:16:35.184 "driver_specific": {} 00:16:35.184 } 00:16:35.184 ] 00:16:35.184 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:35.184 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:35.184 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:35.184 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:35.443 BaseBdev3 00:16:35.443 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:35.443 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:35.443 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:35.443 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:35.443 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:35.443 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:35.443 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:35.701 13:42:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:35.960 [ 00:16:35.960 { 00:16:35.960 "name": "BaseBdev3", 00:16:35.960 "aliases": [ 00:16:35.960 "67b46dd0-929c-4b4b-8184-ded9b4774756" 00:16:35.960 ], 00:16:35.960 "product_name": "Malloc disk", 00:16:35.960 "block_size": 512, 00:16:35.960 "num_blocks": 65536, 00:16:35.960 "uuid": "67b46dd0-929c-4b4b-8184-ded9b4774756", 00:16:35.960 "assigned_rate_limits": { 00:16:35.960 "rw_ios_per_sec": 0, 00:16:35.960 "rw_mbytes_per_sec": 0, 00:16:35.960 "r_mbytes_per_sec": 0, 00:16:35.960 "w_mbytes_per_sec": 0 00:16:35.960 }, 00:16:35.960 "claimed": false, 00:16:35.960 "zoned": false, 00:16:35.960 "supported_io_types": { 00:16:35.960 "read": true, 00:16:35.960 "write": true, 00:16:35.960 "unmap": true, 00:16:35.960 "flush": true, 00:16:35.960 "reset": true, 00:16:35.960 "nvme_admin": false, 00:16:35.960 "nvme_io": false, 00:16:35.960 "nvme_io_md": false, 00:16:35.960 "write_zeroes": true, 00:16:35.960 "zcopy": true, 00:16:35.960 "get_zone_info": false, 00:16:35.960 "zone_management": false, 00:16:35.960 "zone_append": false, 00:16:35.960 "compare": false, 00:16:35.960 "compare_and_write": false, 00:16:35.960 "abort": true, 00:16:35.960 "seek_hole": false, 00:16:35.960 "seek_data": false, 00:16:35.960 "copy": true, 00:16:35.960 "nvme_iov_md": false 00:16:35.960 }, 00:16:35.960 "memory_domains": [ 00:16:35.960 { 
00:16:35.960 "dma_device_id": "system", 00:16:35.960 "dma_device_type": 1 00:16:35.960 }, 00:16:35.960 { 00:16:35.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.960 "dma_device_type": 2 00:16:35.960 } 00:16:35.960 ], 00:16:35.960 "driver_specific": {} 00:16:35.960 } 00:16:35.960 ] 00:16:35.960 13:42:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:35.960 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:35.960 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:35.960 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:36.219 [2024-07-12 13:42:24.623281] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:36.219 [2024-07-12 13:42:24.623322] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:36.219 [2024-07-12 13:42:24.623341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:36.219 [2024-07-12 13:42:24.624680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.219 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:36.478 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:36.478 "name": "Existed_Raid", 00:16:36.478 "uuid": "0ae4e731-c1d4-4348-bb68-bf91bcbe7fb6", 00:16:36.478 "strip_size_kb": 0, 00:16:36.478 "state": "configuring", 00:16:36.478 "raid_level": "raid1", 00:16:36.478 "superblock": true, 00:16:36.478 "num_base_bdevs": 3, 00:16:36.478 "num_base_bdevs_discovered": 2, 00:16:36.478 "num_base_bdevs_operational": 3, 00:16:36.478 "base_bdevs_list": [ 00:16:36.478 { 00:16:36.478 "name": "BaseBdev1", 00:16:36.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.478 "is_configured": 
false, 00:16:36.478 "data_offset": 0, 00:16:36.478 "data_size": 0 00:16:36.478 }, 00:16:36.478 { 00:16:36.478 "name": "BaseBdev2", 00:16:36.478 "uuid": "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6", 00:16:36.478 "is_configured": true, 00:16:36.478 "data_offset": 2048, 00:16:36.478 "data_size": 63488 00:16:36.478 }, 00:16:36.478 { 00:16:36.478 "name": "BaseBdev3", 00:16:36.478 "uuid": "67b46dd0-929c-4b4b-8184-ded9b4774756", 00:16:36.478 "is_configured": true, 00:16:36.478 "data_offset": 2048, 00:16:36.478 "data_size": 63488 00:16:36.478 } 00:16:36.478 ] 00:16:36.478 }' 00:16:36.478 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:36.478 13:42:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:37.045 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:37.045 [2024-07-12 13:42:25.625910] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:37.327 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:37.327 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:37.328 "name": "Existed_Raid", 00:16:37.328 "uuid": "0ae4e731-c1d4-4348-bb68-bf91bcbe7fb6", 00:16:37.328 "strip_size_kb": 0, 00:16:37.328 "state": "configuring", 00:16:37.328 "raid_level": "raid1", 00:16:37.328 "superblock": true, 00:16:37.328 "num_base_bdevs": 3, 00:16:37.328 "num_base_bdevs_discovered": 1, 00:16:37.328 "num_base_bdevs_operational": 3, 00:16:37.328 "base_bdevs_list": [ 00:16:37.328 { 00:16:37.328 "name": "BaseBdev1", 00:16:37.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.328 "is_configured": false, 00:16:37.328 "data_offset": 0, 00:16:37.328 "data_size": 0 00:16:37.328 }, 00:16:37.328 { 00:16:37.328 "name": null, 00:16:37.328 "uuid": "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6", 00:16:37.328 "is_configured": false, 00:16:37.328 "data_offset": 2048, 00:16:37.328 "data_size": 
63488 00:16:37.328 }, 00:16:37.328 { 00:16:37.328 "name": "BaseBdev3", 00:16:37.328 "uuid": "67b46dd0-929c-4b4b-8184-ded9b4774756", 00:16:37.328 "is_configured": true, 00:16:37.328 "data_offset": 2048, 00:16:37.328 "data_size": 63488 00:16:37.328 } 00:16:37.328 ] 00:16:37.328 }' 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:37.328 13:42:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:38.265 13:42:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.265 13:42:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:38.265 13:42:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:38.265 13:42:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:38.524 [2024-07-12 13:42:26.993123] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:38.524 BaseBdev1 00:16:38.524 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:38.524 13:42:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:38.524 13:42:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:38.524 13:42:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:38.524 13:42:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:38.524 13:42:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:38.524 13:42:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:38.784 13:42:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:39.043 [ 00:16:39.043 { 00:16:39.043 "name": "BaseBdev1", 00:16:39.043 "aliases": [ 00:16:39.043 "aa14213d-dafb-413c-a96e-65f24c497e60" 00:16:39.043 ], 00:16:39.043 "product_name": "Malloc disk", 00:16:39.043 "block_size": 512, 00:16:39.043 "num_blocks": 65536, 00:16:39.043 "uuid": "aa14213d-dafb-413c-a96e-65f24c497e60", 00:16:39.043 "assigned_rate_limits": { 00:16:39.043 "rw_ios_per_sec": 0, 00:16:39.043 "rw_mbytes_per_sec": 0, 00:16:39.043 "r_mbytes_per_sec": 0, 00:16:39.043 "w_mbytes_per_sec": 0 00:16:39.043 }, 00:16:39.043 "claimed": true, 00:16:39.043 "claim_type": "exclusive_write", 00:16:39.043 "zoned": false, 00:16:39.043 "supported_io_types": { 00:16:39.044 "read": true, 00:16:39.044 "write": true, 00:16:39.044 "unmap": true, 00:16:39.044 "flush": true, 00:16:39.044 "reset": true, 00:16:39.044 "nvme_admin": false, 00:16:39.044 "nvme_io": false, 00:16:39.044 "nvme_io_md": false, 00:16:39.044 "write_zeroes": true, 00:16:39.044 "zcopy": true, 00:16:39.044 "get_zone_info": false, 00:16:39.044 "zone_management": false, 00:16:39.044 "zone_append": false, 00:16:39.044 "compare": false, 00:16:39.044 
"compare_and_write": false, 00:16:39.044 "abort": true, 00:16:39.044 "seek_hole": false, 00:16:39.044 "seek_data": false, 00:16:39.044 "copy": true, 00:16:39.044 "nvme_iov_md": false 00:16:39.044 }, 00:16:39.044 "memory_domains": [ 00:16:39.044 { 00:16:39.044 "dma_device_id": "system", 00:16:39.044 "dma_device_type": 1 00:16:39.044 }, 00:16:39.044 { 00:16:39.044 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.044 "dma_device_type": 2 00:16:39.044 } 00:16:39.044 ], 00:16:39.044 "driver_specific": {} 00:16:39.044 } 00:16:39.044 ] 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.044 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:39.303 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:39.303 "name": "Existed_Raid", 00:16:39.303 "uuid": "0ae4e731-c1d4-4348-bb68-bf91bcbe7fb6", 00:16:39.303 "strip_size_kb": 0, 00:16:39.303 "state": "configuring", 00:16:39.303 "raid_level": "raid1", 00:16:39.303 "superblock": true, 00:16:39.303 "num_base_bdevs": 3, 00:16:39.303 "num_base_bdevs_discovered": 2, 00:16:39.303 "num_base_bdevs_operational": 3, 00:16:39.303 "base_bdevs_list": [ 00:16:39.303 { 00:16:39.303 "name": "BaseBdev1", 00:16:39.303 "uuid": "aa14213d-dafb-413c-a96e-65f24c497e60", 00:16:39.303 "is_configured": true, 00:16:39.303 "data_offset": 2048, 00:16:39.303 "data_size": 63488 00:16:39.303 }, 00:16:39.303 { 00:16:39.303 "name": null, 00:16:39.303 "uuid": "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6", 00:16:39.303 "is_configured": false, 00:16:39.303 "data_offset": 2048, 00:16:39.303 "data_size": 63488 00:16:39.303 }, 00:16:39.303 { 00:16:39.303 "name": "BaseBdev3", 00:16:39.303 "uuid": "67b46dd0-929c-4b4b-8184-ded9b4774756", 00:16:39.303 "is_configured": true, 00:16:39.303 "data_offset": 2048, 00:16:39.303 "data_size": 63488 00:16:39.303 } 00:16:39.303 ] 00:16:39.303 }' 00:16:39.303 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:39.303 13:42:27 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:16:39.870 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.870 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:40.130 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:40.130 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:40.389 [2024-07-12 13:42:28.821992] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.389 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:40.648 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.648 "name": "Existed_Raid", 00:16:40.648 "uuid": "0ae4e731-c1d4-4348-bb68-bf91bcbe7fb6", 00:16:40.648 "strip_size_kb": 0, 00:16:40.648 "state": "configuring", 00:16:40.648 "raid_level": "raid1", 00:16:40.648 "superblock": true, 00:16:40.648 "num_base_bdevs": 3, 00:16:40.648 "num_base_bdevs_discovered": 1, 00:16:40.648 "num_base_bdevs_operational": 3, 00:16:40.648 "base_bdevs_list": [ 00:16:40.648 { 00:16:40.648 "name": "BaseBdev1", 00:16:40.648 "uuid": "aa14213d-dafb-413c-a96e-65f24c497e60", 00:16:40.648 "is_configured": true, 00:16:40.648 "data_offset": 2048, 00:16:40.648 "data_size": 63488 00:16:40.648 }, 00:16:40.648 { 00:16:40.648 "name": null, 00:16:40.648 "uuid": "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6", 00:16:40.648 "is_configured": false, 00:16:40.648 "data_offset": 2048, 00:16:40.648 "data_size": 63488 00:16:40.648 }, 00:16:40.648 { 00:16:40.648 "name": null, 00:16:40.648 "uuid": "67b46dd0-929c-4b4b-8184-ded9b4774756", 00:16:40.648 "is_configured": false, 00:16:40.648 "data_offset": 2048, 00:16:40.648 "data_size": 63488 00:16:40.648 } 00:16:40.648 ] 00:16:40.648 }' 
00:16:40.648 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.648 13:42:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:41.215 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.215 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:41.474 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:41.474 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:41.735 [2024-07-12 13:42:30.093427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.735 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.994 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.994 "name": "Existed_Raid", 00:16:41.994 "uuid": "0ae4e731-c1d4-4348-bb68-bf91bcbe7fb6", 00:16:41.994 "strip_size_kb": 0, 00:16:41.994 "state": "configuring", 00:16:41.994 "raid_level": "raid1", 00:16:41.994 "superblock": true, 00:16:41.994 "num_base_bdevs": 3, 00:16:41.994 "num_base_bdevs_discovered": 2, 00:16:41.994 "num_base_bdevs_operational": 3, 00:16:41.994 "base_bdevs_list": [ 00:16:41.994 { 00:16:41.994 "name": "BaseBdev1", 00:16:41.994 "uuid": "aa14213d-dafb-413c-a96e-65f24c497e60", 00:16:41.994 "is_configured": true, 00:16:41.994 "data_offset": 2048, 00:16:41.994 "data_size": 63488 00:16:41.994 }, 00:16:41.994 { 00:16:41.994 "name": null, 00:16:41.994 "uuid": "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6", 00:16:41.994 "is_configured": false, 00:16:41.994 "data_offset": 2048, 00:16:41.994 "data_size": 63488 00:16:41.994 }, 00:16:41.994 { 00:16:41.994 "name": "BaseBdev3", 
00:16:41.994 "uuid": "67b46dd0-929c-4b4b-8184-ded9b4774756", 00:16:41.994 "is_configured": true, 00:16:41.994 "data_offset": 2048, 00:16:41.994 "data_size": 63488 00:16:41.994 } 00:16:41.994 ] 00:16:41.994 }' 00:16:41.994 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.994 13:42:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:42.560 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:42.560 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.819 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:42.819 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:43.077 [2024-07-12 13:42:31.457047] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:43.077 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:43.077 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.077 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.077 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:43.077 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:43.078 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:43.078 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.078 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.078 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.078 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.078 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.078 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.336 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.336 "name": "Existed_Raid", 00:16:43.336 "uuid": "0ae4e731-c1d4-4348-bb68-bf91bcbe7fb6", 00:16:43.336 "strip_size_kb": 0, 00:16:43.336 "state": "configuring", 00:16:43.336 "raid_level": "raid1", 00:16:43.336 "superblock": true, 00:16:43.336 "num_base_bdevs": 3, 00:16:43.336 "num_base_bdevs_discovered": 1, 00:16:43.336 "num_base_bdevs_operational": 3, 00:16:43.336 "base_bdevs_list": [ 00:16:43.336 { 00:16:43.336 "name": null, 00:16:43.336 "uuid": "aa14213d-dafb-413c-a96e-65f24c497e60", 00:16:43.336 "is_configured": false, 00:16:43.336 "data_offset": 2048, 00:16:43.336 "data_size": 63488 00:16:43.336 }, 00:16:43.336 { 00:16:43.336 "name": null, 00:16:43.336 "uuid": "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6", 00:16:43.336 
"is_configured": false, 00:16:43.336 "data_offset": 2048, 00:16:43.336 "data_size": 63488 00:16:43.336 }, 00:16:43.336 { 00:16:43.336 "name": "BaseBdev3", 00:16:43.336 "uuid": "67b46dd0-929c-4b4b-8184-ded9b4774756", 00:16:43.336 "is_configured": true, 00:16:43.336 "data_offset": 2048, 00:16:43.336 "data_size": 63488 00:16:43.336 } 00:16:43.336 ] 00:16:43.336 }' 00:16:43.336 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.336 13:42:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:43.902 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.902 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:44.160 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:44.160 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:44.419 [2024-07-12 13:42:32.784976] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.419 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.677 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.677 "name": "Existed_Raid", 00:16:44.677 "uuid": "0ae4e731-c1d4-4348-bb68-bf91bcbe7fb6", 00:16:44.677 "strip_size_kb": 0, 00:16:44.677 "state": "configuring", 00:16:44.677 "raid_level": "raid1", 00:16:44.677 "superblock": true, 00:16:44.677 "num_base_bdevs": 3, 00:16:44.677 "num_base_bdevs_discovered": 2, 00:16:44.677 "num_base_bdevs_operational": 3, 00:16:44.677 "base_bdevs_list": [ 00:16:44.677 { 00:16:44.677 "name": null, 00:16:44.677 "uuid": "aa14213d-dafb-413c-a96e-65f24c497e60", 00:16:44.677 "is_configured": false, 
00:16:44.677 "data_offset": 2048, 00:16:44.677 "data_size": 63488 00:16:44.677 }, 00:16:44.677 { 00:16:44.677 "name": "BaseBdev2", 00:16:44.677 "uuid": "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6", 00:16:44.677 "is_configured": true, 00:16:44.677 "data_offset": 2048, 00:16:44.677 "data_size": 63488 00:16:44.677 }, 00:16:44.677 { 00:16:44.677 "name": "BaseBdev3", 00:16:44.677 "uuid": "67b46dd0-929c-4b4b-8184-ded9b4774756", 00:16:44.677 "is_configured": true, 00:16:44.677 "data_offset": 2048, 00:16:44.677 "data_size": 63488 00:16:44.677 } 00:16:44.677 ] 00:16:44.677 }' 00:16:44.677 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.677 13:42:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:45.243 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:45.243 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.502 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:45.502 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.502 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:45.760 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u aa14213d-dafb-413c-a96e-65f24c497e60 00:16:46.020 [2024-07-12 13:42:34.373548] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:46.020 [2024-07-12 13:42:34.373707] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcbc510 00:16:46.020 [2024-07-12 13:42:34.373720] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:46.020 [2024-07-12 13:42:34.373899] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb18320 00:16:46.020 [2024-07-12 13:42:34.374034] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcbc510 00:16:46.020 [2024-07-12 13:42:34.374045] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcbc510 00:16:46.020 [2024-07-12 13:42:34.374144] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:46.020 NewBaseBdev 00:16:46.020 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:46.020 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:46.020 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:46.020 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:46.020 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:46.020 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:46.020 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:46.280 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:46.539 [ 00:16:46.539 { 00:16:46.539 "name": "NewBaseBdev", 00:16:46.539 "aliases": [ 00:16:46.539 "aa14213d-dafb-413c-a96e-65f24c497e60" 00:16:46.539 ], 00:16:46.539 "product_name": "Malloc disk", 00:16:46.539 "block_size": 512, 00:16:46.539 "num_blocks": 65536, 00:16:46.539 "uuid": "aa14213d-dafb-413c-a96e-65f24c497e60", 00:16:46.539 "assigned_rate_limits": { 00:16:46.539 "rw_ios_per_sec": 0, 00:16:46.539 "rw_mbytes_per_sec": 0, 00:16:46.539 "r_mbytes_per_sec": 0, 00:16:46.539 "w_mbytes_per_sec": 0 00:16:46.539 }, 00:16:46.539 "claimed": true, 00:16:46.539 "claim_type": "exclusive_write", 00:16:46.539 "zoned": false, 00:16:46.540 "supported_io_types": { 00:16:46.540 "read": true, 00:16:46.540 "write": true, 00:16:46.540 "unmap": true, 00:16:46.540 "flush": true, 00:16:46.540 "reset": true, 00:16:46.540 "nvme_admin": false, 00:16:46.540 "nvme_io": false, 00:16:46.540 "nvme_io_md": false, 00:16:46.540 "write_zeroes": true, 00:16:46.540 "zcopy": true, 00:16:46.540 "get_zone_info": false, 00:16:46.540 "zone_management": false, 00:16:46.540 "zone_append": false, 00:16:46.540 "compare": false, 00:16:46.540 "compare_and_write": false, 00:16:46.540 "abort": true, 00:16:46.540 "seek_hole": false, 00:16:46.540 "seek_data": false, 00:16:46.540 "copy": true, 00:16:46.540 "nvme_iov_md": false 00:16:46.540 }, 00:16:46.540 "memory_domains": [ 00:16:46.540 { 00:16:46.540 "dma_device_id": "system", 00:16:46.540 "dma_device_type": 1 00:16:46.540 }, 00:16:46.540 { 00:16:46.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:46.540 "dma_device_type": 2 00:16:46.540 } 00:16:46.540 ], 00:16:46.540 "driver_specific": {} 00:16:46.540 } 00:16:46.540 ] 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.540 13:42:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:46.799 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.799 "name": "Existed_Raid", 00:16:46.799 "uuid": "0ae4e731-c1d4-4348-bb68-bf91bcbe7fb6", 00:16:46.799 "strip_size_kb": 0, 00:16:46.799 "state": "online", 00:16:46.799 "raid_level": "raid1", 00:16:46.799 "superblock": true, 00:16:46.799 "num_base_bdevs": 3, 00:16:46.799 "num_base_bdevs_discovered": 3, 00:16:46.799 "num_base_bdevs_operational": 3, 00:16:46.799 "base_bdevs_list": [ 00:16:46.799 { 00:16:46.799 "name": "NewBaseBdev", 00:16:46.799 "uuid": "aa14213d-dafb-413c-a96e-65f24c497e60", 00:16:46.799 "is_configured": true, 00:16:46.799 "data_offset": 2048, 00:16:46.799 "data_size": 63488 00:16:46.799 }, 00:16:46.799 { 00:16:46.799 "name": "BaseBdev2", 00:16:46.799 "uuid": "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6", 00:16:46.799 "is_configured": true, 00:16:46.799 "data_offset": 2048, 00:16:46.799 "data_size": 63488 00:16:46.799 }, 00:16:46.799 { 00:16:46.799 "name": "BaseBdev3", 00:16:46.799 "uuid": "67b46dd0-929c-4b4b-8184-ded9b4774756", 00:16:46.799 "is_configured": true, 00:16:46.799 "data_offset": 2048, 00:16:46.799 "data_size": 63488 00:16:46.799 } 00:16:46.799 ] 00:16:46.799 }' 00:16:46.799 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:46.799 13:42:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:47.367 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:47.367 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:47.367 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:47.367 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:47.367 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:47.367 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:47.367 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:47.367 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:47.367 [2024-07-12 13:42:35.885853] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:47.367 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:47.367 "name": "Existed_Raid", 00:16:47.367 "aliases": [ 00:16:47.367 "0ae4e731-c1d4-4348-bb68-bf91bcbe7fb6" 00:16:47.367 ], 00:16:47.367 "product_name": "Raid Volume", 00:16:47.367 "block_size": 512, 00:16:47.367 "num_blocks": 63488, 00:16:47.367 "uuid": "0ae4e731-c1d4-4348-bb68-bf91bcbe7fb6", 00:16:47.367 "assigned_rate_limits": { 00:16:47.367 "rw_ios_per_sec": 0, 00:16:47.367 "rw_mbytes_per_sec": 0, 00:16:47.367 "r_mbytes_per_sec": 0, 00:16:47.367 "w_mbytes_per_sec": 0 00:16:47.367 }, 00:16:47.367 "claimed": false, 00:16:47.367 "zoned": false, 00:16:47.367 "supported_io_types": { 00:16:47.367 "read": true, 00:16:47.367 "write": true, 00:16:47.367 "unmap": false, 00:16:47.367 "flush": false, 00:16:47.367 "reset": true, 00:16:47.367 "nvme_admin": false, 00:16:47.367 "nvme_io": false, 00:16:47.367 "nvme_io_md": 
false, 00:16:47.367 "write_zeroes": true, 00:16:47.367 "zcopy": false, 00:16:47.367 "get_zone_info": false, 00:16:47.367 "zone_management": false, 00:16:47.367 "zone_append": false, 00:16:47.367 "compare": false, 00:16:47.367 "compare_and_write": false, 00:16:47.367 "abort": false, 00:16:47.367 "seek_hole": false, 00:16:47.367 "seek_data": false, 00:16:47.367 "copy": false, 00:16:47.367 "nvme_iov_md": false 00:16:47.367 }, 00:16:47.367 "memory_domains": [ 00:16:47.367 { 00:16:47.367 "dma_device_id": "system", 00:16:47.367 "dma_device_type": 1 00:16:47.367 }, 00:16:47.367 { 00:16:47.367 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.367 "dma_device_type": 2 00:16:47.367 }, 00:16:47.367 { 00:16:47.367 "dma_device_id": "system", 00:16:47.367 "dma_device_type": 1 00:16:47.367 }, 00:16:47.367 { 00:16:47.367 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.367 "dma_device_type": 2 00:16:47.367 }, 00:16:47.367 { 00:16:47.367 "dma_device_id": "system", 00:16:47.367 "dma_device_type": 1 00:16:47.367 }, 00:16:47.368 { 00:16:47.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.368 "dma_device_type": 2 00:16:47.368 } 00:16:47.368 ], 00:16:47.368 "driver_specific": { 00:16:47.368 "raid": { 00:16:47.368 "uuid": "0ae4e731-c1d4-4348-bb68-bf91bcbe7fb6", 00:16:47.368 "strip_size_kb": 0, 00:16:47.368 "state": "online", 00:16:47.368 "raid_level": "raid1", 00:16:47.368 "superblock": true, 00:16:47.368 "num_base_bdevs": 3, 00:16:47.368 "num_base_bdevs_discovered": 3, 00:16:47.368 "num_base_bdevs_operational": 3, 00:16:47.368 "base_bdevs_list": [ 00:16:47.368 { 00:16:47.368 "name": "NewBaseBdev", 00:16:47.368 "uuid": "aa14213d-dafb-413c-a96e-65f24c497e60", 00:16:47.368 "is_configured": true, 00:16:47.368 "data_offset": 2048, 00:16:47.368 "data_size": 63488 00:16:47.368 }, 00:16:47.368 { 00:16:47.368 "name": "BaseBdev2", 00:16:47.368 "uuid": "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6", 00:16:47.368 "is_configured": true, 00:16:47.368 "data_offset": 2048, 00:16:47.368 "data_size": 63488 00:16:47.368 }, 00:16:47.368 { 00:16:47.368 "name": "BaseBdev3", 00:16:47.368 "uuid": "67b46dd0-929c-4b4b-8184-ded9b4774756", 00:16:47.368 "is_configured": true, 00:16:47.368 "data_offset": 2048, 00:16:47.368 "data_size": 63488 00:16:47.368 } 00:16:47.368 ] 00:16:47.368 } 00:16:47.368 } 00:16:47.368 }' 00:16:47.368 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:47.627 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:47.627 BaseBdev2 00:16:47.627 BaseBdev3' 00:16:47.627 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:47.627 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:47.627 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:47.627 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:47.627 "name": "NewBaseBdev", 00:16:47.627 "aliases": [ 00:16:47.627 "aa14213d-dafb-413c-a96e-65f24c497e60" 00:16:47.627 ], 00:16:47.627 "product_name": "Malloc disk", 00:16:47.627 "block_size": 512, 00:16:47.627 "num_blocks": 65536, 00:16:47.627 "uuid": "aa14213d-dafb-413c-a96e-65f24c497e60", 00:16:47.627 "assigned_rate_limits": { 00:16:47.627 
"rw_ios_per_sec": 0, 00:16:47.627 "rw_mbytes_per_sec": 0, 00:16:47.627 "r_mbytes_per_sec": 0, 00:16:47.627 "w_mbytes_per_sec": 0 00:16:47.627 }, 00:16:47.627 "claimed": true, 00:16:47.627 "claim_type": "exclusive_write", 00:16:47.627 "zoned": false, 00:16:47.627 "supported_io_types": { 00:16:47.627 "read": true, 00:16:47.627 "write": true, 00:16:47.627 "unmap": true, 00:16:47.627 "flush": true, 00:16:47.627 "reset": true, 00:16:47.627 "nvme_admin": false, 00:16:47.627 "nvme_io": false, 00:16:47.627 "nvme_io_md": false, 00:16:47.627 "write_zeroes": true, 00:16:47.627 "zcopy": true, 00:16:47.627 "get_zone_info": false, 00:16:47.627 "zone_management": false, 00:16:47.627 "zone_append": false, 00:16:47.627 "compare": false, 00:16:47.627 "compare_and_write": false, 00:16:47.627 "abort": true, 00:16:47.627 "seek_hole": false, 00:16:47.627 "seek_data": false, 00:16:47.627 "copy": true, 00:16:47.627 "nvme_iov_md": false 00:16:47.627 }, 00:16:47.627 "memory_domains": [ 00:16:47.627 { 00:16:47.627 "dma_device_id": "system", 00:16:47.627 "dma_device_type": 1 00:16:47.627 }, 00:16:47.627 { 00:16:47.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.627 "dma_device_type": 2 00:16:47.627 } 00:16:47.627 ], 00:16:47.627 "driver_specific": {} 00:16:47.627 }' 00:16:47.627 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:47.886 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:47.886 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:47.886 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:47.886 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:47.886 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:47.886 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:47.886 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:47.886 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:47.886 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:48.145 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:48.145 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:48.145 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:48.145 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:48.145 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:48.404 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:48.404 "name": "BaseBdev2", 00:16:48.404 "aliases": [ 00:16:48.404 "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6" 00:16:48.404 ], 00:16:48.404 "product_name": "Malloc disk", 00:16:48.404 "block_size": 512, 00:16:48.404 "num_blocks": 65536, 00:16:48.404 "uuid": "1f148e6e-3f4c-4f39-a0c1-5253fc5347f6", 00:16:48.404 "assigned_rate_limits": { 00:16:48.404 "rw_ios_per_sec": 0, 00:16:48.404 "rw_mbytes_per_sec": 0, 00:16:48.404 "r_mbytes_per_sec": 0, 00:16:48.404 "w_mbytes_per_sec": 0 
00:16:48.404 }, 00:16:48.404 "claimed": true, 00:16:48.404 "claim_type": "exclusive_write", 00:16:48.404 "zoned": false, 00:16:48.404 "supported_io_types": { 00:16:48.404 "read": true, 00:16:48.404 "write": true, 00:16:48.404 "unmap": true, 00:16:48.404 "flush": true, 00:16:48.404 "reset": true, 00:16:48.404 "nvme_admin": false, 00:16:48.404 "nvme_io": false, 00:16:48.404 "nvme_io_md": false, 00:16:48.404 "write_zeroes": true, 00:16:48.404 "zcopy": true, 00:16:48.404 "get_zone_info": false, 00:16:48.404 "zone_management": false, 00:16:48.404 "zone_append": false, 00:16:48.404 "compare": false, 00:16:48.404 "compare_and_write": false, 00:16:48.404 "abort": true, 00:16:48.404 "seek_hole": false, 00:16:48.404 "seek_data": false, 00:16:48.404 "copy": true, 00:16:48.404 "nvme_iov_md": false 00:16:48.404 }, 00:16:48.404 "memory_domains": [ 00:16:48.404 { 00:16:48.404 "dma_device_id": "system", 00:16:48.404 "dma_device_type": 1 00:16:48.404 }, 00:16:48.404 { 00:16:48.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.404 "dma_device_type": 2 00:16:48.404 } 00:16:48.404 ], 00:16:48.404 "driver_specific": {} 00:16:48.404 }' 00:16:48.404 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.404 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.404 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:48.404 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:48.404 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:48.404 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:48.404 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:48.663 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:48.663 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:48.663 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:48.663 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:48.663 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:48.663 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:48.663 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:48.663 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:48.922 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:48.922 "name": "BaseBdev3", 00:16:48.922 "aliases": [ 00:16:48.922 "67b46dd0-929c-4b4b-8184-ded9b4774756" 00:16:48.922 ], 00:16:48.922 "product_name": "Malloc disk", 00:16:48.922 "block_size": 512, 00:16:48.922 "num_blocks": 65536, 00:16:48.922 "uuid": "67b46dd0-929c-4b4b-8184-ded9b4774756", 00:16:48.922 "assigned_rate_limits": { 00:16:48.922 "rw_ios_per_sec": 0, 00:16:48.922 "rw_mbytes_per_sec": 0, 00:16:48.922 "r_mbytes_per_sec": 0, 00:16:48.922 "w_mbytes_per_sec": 0 00:16:48.922 }, 00:16:48.922 "claimed": true, 00:16:48.922 "claim_type": "exclusive_write", 00:16:48.922 "zoned": false, 00:16:48.922 
"supported_io_types": { 00:16:48.922 "read": true, 00:16:48.922 "write": true, 00:16:48.922 "unmap": true, 00:16:48.922 "flush": true, 00:16:48.922 "reset": true, 00:16:48.922 "nvme_admin": false, 00:16:48.922 "nvme_io": false, 00:16:48.922 "nvme_io_md": false, 00:16:48.922 "write_zeroes": true, 00:16:48.922 "zcopy": true, 00:16:48.922 "get_zone_info": false, 00:16:48.922 "zone_management": false, 00:16:48.922 "zone_append": false, 00:16:48.922 "compare": false, 00:16:48.922 "compare_and_write": false, 00:16:48.922 "abort": true, 00:16:48.922 "seek_hole": false, 00:16:48.922 "seek_data": false, 00:16:48.922 "copy": true, 00:16:48.922 "nvme_iov_md": false 00:16:48.922 }, 00:16:48.922 "memory_domains": [ 00:16:48.922 { 00:16:48.922 "dma_device_id": "system", 00:16:48.922 "dma_device_type": 1 00:16:48.922 }, 00:16:48.922 { 00:16:48.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.922 "dma_device_type": 2 00:16:48.922 } 00:16:48.922 ], 00:16:48.922 "driver_specific": {} 00:16:48.922 }' 00:16:48.922 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.922 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.922 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:48.922 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.181 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.181 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:49.181 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.181 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.181 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:49.181 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.181 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.440 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:49.440 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:49.440 [2024-07-12 13:42:37.991150] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:49.440 [2024-07-12 13:42:37.991180] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:49.440 [2024-07-12 13:42:37.991233] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:49.440 [2024-07-12 13:42:37.991510] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:49.440 [2024-07-12 13:42:37.991523] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcbc510 name Existed_Raid, state offline 00:16:49.440 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 479409 00:16:49.440 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 479409 ']' 00:16:49.440 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 479409 00:16:49.440 13:42:38 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@953 -- # uname 00:16:49.440 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:49.440 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 479409 00:16:49.699 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:49.699 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:49.699 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 479409' 00:16:49.699 killing process with pid 479409 00:16:49.699 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 479409 00:16:49.699 [2024-07-12 13:42:38.050442] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:49.699 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 479409 00:16:49.699 [2024-07-12 13:42:38.080809] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:49.958 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:49.958 00:16:49.958 real 0m28.463s 00:16:49.958 user 0m52.194s 00:16:49.958 sys 0m5.099s 00:16:49.958 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:49.958 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:49.958 ************************************ 00:16:49.958 END TEST raid_state_function_test_sb 00:16:49.958 ************************************ 00:16:49.958 13:42:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:49.959 13:42:38 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:16:49.959 13:42:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:49.959 13:42:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:49.959 13:42:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:49.959 ************************************ 00:16:49.959 START TEST raid_superblock_test 00:16:49.959 ************************************ 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 3 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=483693 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 483693 /var/tmp/spdk-raid.sock 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 483693 ']' 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:49.959 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:49.959 13:42:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:49.959 [2024-07-12 13:42:38.453253] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:16:49.959 [2024-07-12 13:42:38.453321] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483693 ] 00:16:50.218 [2024-07-12 13:42:38.583821] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:50.218 [2024-07-12 13:42:38.690360] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:50.218 [2024-07-12 13:42:38.760433] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:50.218 [2024-07-12 13:42:38.760472] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:51.153 13:42:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:51.153 13:42:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:51.153 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:51.153 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:51.153 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:51.153 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:51.153 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:51.153 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:51.153 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:51.153 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:51.153 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:51.153 malloc1 00:16:51.154 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:51.412 [2024-07-12 13:42:39.863991] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:51.412 [2024-07-12 13:42:39.864039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:51.412 [2024-07-12 13:42:39.864060] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1677e90 00:16:51.412 [2024-07-12 13:42:39.864073] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:51.412 [2024-07-12 13:42:39.865836] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:51.412 [2024-07-12 13:42:39.865867] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:51.412 pt1 00:16:51.412 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:51.412 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:51.412 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:51.412 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:51.412 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:51.412 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:51.412 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:51.412 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:51.412 13:42:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:51.670 malloc2 00:16:51.670 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:51.928 [2024-07-12 13:42:40.363196] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:51.929 [2024-07-12 13:42:40.363244] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:51.929 [2024-07-12 13:42:40.363262] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1715fb0 00:16:51.929 [2024-07-12 13:42:40.363275] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:51.929 [2024-07-12 13:42:40.364844] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:51.929 [2024-07-12 13:42:40.364872] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:51.929 pt2 00:16:51.929 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:51.929 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:51.929 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:51.929 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:51.929 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:51.929 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:51.929 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:51.929 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:51.929 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:52.187 malloc3 00:16:52.187 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:52.447 [2024-07-12 13:42:40.869085] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:52.447 [2024-07-12 13:42:40.869128] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:52.447 [2024-07-12 13:42:40.869145] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1716ce0 00:16:52.447 [2024-07-12 13:42:40.869158] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:52.447 [2024-07-12 13:42:40.870554] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:52.447 [2024-07-12 13:42:40.870582] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:52.447 pt3 00:16:52.447 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:52.447 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:52.447 13:42:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:52.707 [2024-07-12 13:42:41.121764] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:52.707 [2024-07-12 13:42:41.122978] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:52.707 [2024-07-12 13:42:41.123034] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:52.707 [2024-07-12 13:42:41.123182] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17199a0 00:16:52.707 [2024-07-12 13:42:41.123194] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:52.707 [2024-07-12 13:42:41.123374] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1676f80 00:16:52.707 [2024-07-12 13:42:41.123516] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17199a0 00:16:52.707 [2024-07-12 13:42:41.123527] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17199a0 00:16:52.707 [2024-07-12 13:42:41.123619] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.707 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:52.967 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:52.967 "name": "raid_bdev1", 00:16:52.967 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:16:52.967 "strip_size_kb": 0, 00:16:52.967 "state": "online", 00:16:52.967 "raid_level": "raid1", 00:16:52.967 "superblock": true, 00:16:52.967 "num_base_bdevs": 3, 00:16:52.967 
"num_base_bdevs_discovered": 3, 00:16:52.967 "num_base_bdevs_operational": 3, 00:16:52.967 "base_bdevs_list": [ 00:16:52.967 { 00:16:52.967 "name": "pt1", 00:16:52.967 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:52.967 "is_configured": true, 00:16:52.967 "data_offset": 2048, 00:16:52.967 "data_size": 63488 00:16:52.967 }, 00:16:52.967 { 00:16:52.967 "name": "pt2", 00:16:52.967 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:52.967 "is_configured": true, 00:16:52.967 "data_offset": 2048, 00:16:52.967 "data_size": 63488 00:16:52.967 }, 00:16:52.967 { 00:16:52.967 "name": "pt3", 00:16:52.967 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:52.967 "is_configured": true, 00:16:52.967 "data_offset": 2048, 00:16:52.967 "data_size": 63488 00:16:52.967 } 00:16:52.967 ] 00:16:52.967 }' 00:16:52.967 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:52.967 13:42:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:53.535 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:53.535 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:53.535 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:53.535 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:53.535 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:53.535 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:53.535 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:53.535 13:42:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:53.535 [2024-07-12 13:42:42.080552] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:53.535 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:53.535 "name": "raid_bdev1", 00:16:53.535 "aliases": [ 00:16:53.535 "e4ebb042-1bf8-4a49-89d7-e53cb875e37b" 00:16:53.535 ], 00:16:53.535 "product_name": "Raid Volume", 00:16:53.535 "block_size": 512, 00:16:53.535 "num_blocks": 63488, 00:16:53.535 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:16:53.535 "assigned_rate_limits": { 00:16:53.535 "rw_ios_per_sec": 0, 00:16:53.535 "rw_mbytes_per_sec": 0, 00:16:53.535 "r_mbytes_per_sec": 0, 00:16:53.535 "w_mbytes_per_sec": 0 00:16:53.535 }, 00:16:53.535 "claimed": false, 00:16:53.535 "zoned": false, 00:16:53.535 "supported_io_types": { 00:16:53.535 "read": true, 00:16:53.535 "write": true, 00:16:53.535 "unmap": false, 00:16:53.535 "flush": false, 00:16:53.535 "reset": true, 00:16:53.535 "nvme_admin": false, 00:16:53.535 "nvme_io": false, 00:16:53.535 "nvme_io_md": false, 00:16:53.535 "write_zeroes": true, 00:16:53.535 "zcopy": false, 00:16:53.535 "get_zone_info": false, 00:16:53.535 "zone_management": false, 00:16:53.535 "zone_append": false, 00:16:53.535 "compare": false, 00:16:53.535 "compare_and_write": false, 00:16:53.535 "abort": false, 00:16:53.535 "seek_hole": false, 00:16:53.535 "seek_data": false, 00:16:53.535 "copy": false, 00:16:53.535 "nvme_iov_md": false 00:16:53.535 }, 00:16:53.535 "memory_domains": [ 00:16:53.535 { 00:16:53.535 "dma_device_id": "system", 00:16:53.535 "dma_device_type": 1 00:16:53.535 }, 
00:16:53.535 { 00:16:53.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.535 "dma_device_type": 2 00:16:53.535 }, 00:16:53.535 { 00:16:53.535 "dma_device_id": "system", 00:16:53.535 "dma_device_type": 1 00:16:53.535 }, 00:16:53.535 { 00:16:53.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.535 "dma_device_type": 2 00:16:53.535 }, 00:16:53.535 { 00:16:53.535 "dma_device_id": "system", 00:16:53.535 "dma_device_type": 1 00:16:53.535 }, 00:16:53.535 { 00:16:53.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.535 "dma_device_type": 2 00:16:53.535 } 00:16:53.535 ], 00:16:53.535 "driver_specific": { 00:16:53.535 "raid": { 00:16:53.535 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:16:53.535 "strip_size_kb": 0, 00:16:53.535 "state": "online", 00:16:53.535 "raid_level": "raid1", 00:16:53.535 "superblock": true, 00:16:53.535 "num_base_bdevs": 3, 00:16:53.535 "num_base_bdevs_discovered": 3, 00:16:53.535 "num_base_bdevs_operational": 3, 00:16:53.535 "base_bdevs_list": [ 00:16:53.535 { 00:16:53.535 "name": "pt1", 00:16:53.535 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:53.535 "is_configured": true, 00:16:53.535 "data_offset": 2048, 00:16:53.535 "data_size": 63488 00:16:53.535 }, 00:16:53.535 { 00:16:53.535 "name": "pt2", 00:16:53.535 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:53.535 "is_configured": true, 00:16:53.535 "data_offset": 2048, 00:16:53.535 "data_size": 63488 00:16:53.535 }, 00:16:53.535 { 00:16:53.535 "name": "pt3", 00:16:53.535 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:53.535 "is_configured": true, 00:16:53.535 "data_offset": 2048, 00:16:53.535 "data_size": 63488 00:16:53.535 } 00:16:53.535 ] 00:16:53.536 } 00:16:53.536 } 00:16:53.536 }' 00:16:53.536 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:53.795 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:53.795 pt2 00:16:53.795 pt3' 00:16:53.795 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:53.795 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:53.795 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.054 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.054 "name": "pt1", 00:16:54.054 "aliases": [ 00:16:54.054 "00000000-0000-0000-0000-000000000001" 00:16:54.054 ], 00:16:54.054 "product_name": "passthru", 00:16:54.054 "block_size": 512, 00:16:54.054 "num_blocks": 65536, 00:16:54.054 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:54.054 "assigned_rate_limits": { 00:16:54.054 "rw_ios_per_sec": 0, 00:16:54.054 "rw_mbytes_per_sec": 0, 00:16:54.054 "r_mbytes_per_sec": 0, 00:16:54.054 "w_mbytes_per_sec": 0 00:16:54.054 }, 00:16:54.054 "claimed": true, 00:16:54.054 "claim_type": "exclusive_write", 00:16:54.054 "zoned": false, 00:16:54.054 "supported_io_types": { 00:16:54.054 "read": true, 00:16:54.054 "write": true, 00:16:54.054 "unmap": true, 00:16:54.054 "flush": true, 00:16:54.054 "reset": true, 00:16:54.054 "nvme_admin": false, 00:16:54.054 "nvme_io": false, 00:16:54.054 "nvme_io_md": false, 00:16:54.054 "write_zeroes": true, 00:16:54.054 "zcopy": true, 00:16:54.054 "get_zone_info": false, 00:16:54.054 "zone_management": false, 00:16:54.054 
"zone_append": false, 00:16:54.054 "compare": false, 00:16:54.054 "compare_and_write": false, 00:16:54.054 "abort": true, 00:16:54.054 "seek_hole": false, 00:16:54.054 "seek_data": false, 00:16:54.054 "copy": true, 00:16:54.054 "nvme_iov_md": false 00:16:54.054 }, 00:16:54.054 "memory_domains": [ 00:16:54.054 { 00:16:54.054 "dma_device_id": "system", 00:16:54.054 "dma_device_type": 1 00:16:54.054 }, 00:16:54.054 { 00:16:54.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.054 "dma_device_type": 2 00:16:54.054 } 00:16:54.054 ], 00:16:54.054 "driver_specific": { 00:16:54.054 "passthru": { 00:16:54.054 "name": "pt1", 00:16:54.054 "base_bdev_name": "malloc1" 00:16:54.054 } 00:16:54.054 } 00:16:54.054 }' 00:16:54.054 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.054 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.054 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.054 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.054 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.054 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.054 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.054 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.313 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.313 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.313 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.313 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.313 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.313 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:54.313 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.571 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.571 "name": "pt2", 00:16:54.571 "aliases": [ 00:16:54.571 "00000000-0000-0000-0000-000000000002" 00:16:54.571 ], 00:16:54.571 "product_name": "passthru", 00:16:54.571 "block_size": 512, 00:16:54.571 "num_blocks": 65536, 00:16:54.571 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:54.571 "assigned_rate_limits": { 00:16:54.571 "rw_ios_per_sec": 0, 00:16:54.571 "rw_mbytes_per_sec": 0, 00:16:54.571 "r_mbytes_per_sec": 0, 00:16:54.571 "w_mbytes_per_sec": 0 00:16:54.571 }, 00:16:54.571 "claimed": true, 00:16:54.571 "claim_type": "exclusive_write", 00:16:54.571 "zoned": false, 00:16:54.571 "supported_io_types": { 00:16:54.571 "read": true, 00:16:54.571 "write": true, 00:16:54.571 "unmap": true, 00:16:54.571 "flush": true, 00:16:54.571 "reset": true, 00:16:54.571 "nvme_admin": false, 00:16:54.571 "nvme_io": false, 00:16:54.571 "nvme_io_md": false, 00:16:54.571 "write_zeroes": true, 00:16:54.571 "zcopy": true, 00:16:54.571 "get_zone_info": false, 00:16:54.571 "zone_management": false, 00:16:54.571 "zone_append": false, 00:16:54.571 "compare": false, 00:16:54.571 "compare_and_write": false, 00:16:54.571 "abort": true, 00:16:54.571 
"seek_hole": false, 00:16:54.571 "seek_data": false, 00:16:54.571 "copy": true, 00:16:54.571 "nvme_iov_md": false 00:16:54.571 }, 00:16:54.571 "memory_domains": [ 00:16:54.571 { 00:16:54.571 "dma_device_id": "system", 00:16:54.571 "dma_device_type": 1 00:16:54.571 }, 00:16:54.571 { 00:16:54.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.571 "dma_device_type": 2 00:16:54.571 } 00:16:54.571 ], 00:16:54.571 "driver_specific": { 00:16:54.571 "passthru": { 00:16:54.571 "name": "pt2", 00:16:54.571 "base_bdev_name": "malloc2" 00:16:54.571 } 00:16:54.571 } 00:16:54.571 }' 00:16:54.571 13:42:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.571 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.571 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.571 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.571 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.829 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.829 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.829 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.829 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.829 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.829 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.829 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.829 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.830 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:54.830 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:55.088 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:55.088 "name": "pt3", 00:16:55.088 "aliases": [ 00:16:55.088 "00000000-0000-0000-0000-000000000003" 00:16:55.088 ], 00:16:55.088 "product_name": "passthru", 00:16:55.088 "block_size": 512, 00:16:55.088 "num_blocks": 65536, 00:16:55.088 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:55.088 "assigned_rate_limits": { 00:16:55.088 "rw_ios_per_sec": 0, 00:16:55.088 "rw_mbytes_per_sec": 0, 00:16:55.088 "r_mbytes_per_sec": 0, 00:16:55.088 "w_mbytes_per_sec": 0 00:16:55.088 }, 00:16:55.088 "claimed": true, 00:16:55.088 "claim_type": "exclusive_write", 00:16:55.088 "zoned": false, 00:16:55.088 "supported_io_types": { 00:16:55.089 "read": true, 00:16:55.089 "write": true, 00:16:55.089 "unmap": true, 00:16:55.089 "flush": true, 00:16:55.089 "reset": true, 00:16:55.089 "nvme_admin": false, 00:16:55.089 "nvme_io": false, 00:16:55.089 "nvme_io_md": false, 00:16:55.089 "write_zeroes": true, 00:16:55.089 "zcopy": true, 00:16:55.089 "get_zone_info": false, 00:16:55.089 "zone_management": false, 00:16:55.089 "zone_append": false, 00:16:55.089 "compare": false, 00:16:55.089 "compare_and_write": false, 00:16:55.089 "abort": true, 00:16:55.089 "seek_hole": false, 00:16:55.089 "seek_data": false, 00:16:55.089 "copy": true, 00:16:55.089 "nvme_iov_md": false 00:16:55.089 }, 
00:16:55.089 "memory_domains": [ 00:16:55.089 { 00:16:55.089 "dma_device_id": "system", 00:16:55.089 "dma_device_type": 1 00:16:55.089 }, 00:16:55.089 { 00:16:55.089 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.089 "dma_device_type": 2 00:16:55.089 } 00:16:55.089 ], 00:16:55.089 "driver_specific": { 00:16:55.089 "passthru": { 00:16:55.089 "name": "pt3", 00:16:55.089 "base_bdev_name": "malloc3" 00:16:55.089 } 00:16:55.089 } 00:16:55.089 }' 00:16:55.089 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.089 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.347 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:55.347 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.347 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.347 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.347 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.347 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.347 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.347 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.347 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.606 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.606 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:55.606 13:42:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:55.606 [2024-07-12 13:42:44.178130] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:55.865 13:42:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=e4ebb042-1bf8-4a49-89d7-e53cb875e37b 00:16:55.865 13:42:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z e4ebb042-1bf8-4a49-89d7-e53cb875e37b ']' 00:16:55.865 13:42:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:55.865 [2024-07-12 13:42:44.422496] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:55.865 [2024-07-12 13:42:44.422518] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:55.865 [2024-07-12 13:42:44.422566] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:55.865 [2024-07-12 13:42:44.422637] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:55.865 [2024-07-12 13:42:44.422650] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17199a0 name raid_bdev1, state offline 00:16:55.865 13:42:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.865 13:42:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:56.124 13:42:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:56.124 13:42:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:56.124 13:42:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:56.124 13:42:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:56.382 13:42:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:56.382 13:42:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:56.642 13:42:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:56.642 13:42:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:56.901 13:42:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:56.901 13:42:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:57.160 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:57.419 
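For reference, a minimal standalone sketch of the negative check traced above (bdev_raid.sh@456's NOT-wrapped create) — assuming an SPDK target is still listening on /var/tmp/spdk-raid.sock and that malloc1-malloc3 still hold the raid superblock left behind by the raid_bdev1 instance deleted just above — could look like the lines below; the error output of the traced call itself follows right after.

  # Paths as they appear in this run; adjust for a local checkout.
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # Re-creating raid_bdev1 over base bdevs that carry a foreign raid superblock
  # must fail (rpc.py reports the JSON-RPC "File exists" error, code -17).
  if "$rpc" -s "$sock" bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1; then
      echo "unexpected: bdev_raid_create succeeded despite stale superblocks" >&2
      exit 1
  fi

  # The failed create must not leave raid_bdev1 behind in any state.
  "$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

The if-block is just an illustration of the pass/fail logic that the harness's NOT() helper in autotest_common.sh implements with its es exit-status bookkeeping, as seen in the trace.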
[2024-07-12 13:42:45.878286] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:57.419 [2024-07-12 13:42:45.879679] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:57.419 [2024-07-12 13:42:45.879723] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:57.419 [2024-07-12 13:42:45.879769] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:57.419 [2024-07-12 13:42:45.879810] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:57.419 [2024-07-12 13:42:45.879833] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:57.419 [2024-07-12 13:42:45.879851] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:57.419 [2024-07-12 13:42:45.879860] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x171aa20 name raid_bdev1, state configuring 00:16:57.419 request: 00:16:57.419 { 00:16:57.419 "name": "raid_bdev1", 00:16:57.419 "raid_level": "raid1", 00:16:57.419 "base_bdevs": [ 00:16:57.419 "malloc1", 00:16:57.419 "malloc2", 00:16:57.419 "malloc3" 00:16:57.419 ], 00:16:57.419 "superblock": false, 00:16:57.419 "method": "bdev_raid_create", 00:16:57.419 "req_id": 1 00:16:57.419 } 00:16:57.419 Got JSON-RPC error response 00:16:57.419 response: 00:16:57.419 { 00:16:57.419 "code": -17, 00:16:57.419 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:57.419 } 00:16:57.419 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:57.419 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:57.419 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:57.419 13:42:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:57.419 13:42:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.419 13:42:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:57.678 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:57.678 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:57.678 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:57.936 [2024-07-12 13:42:46.367522] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:57.936 [2024-07-12 13:42:46.367567] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:57.936 [2024-07-12 13:42:46.367585] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1717de0 00:16:57.936 [2024-07-12 13:42:46.367597] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:57.936 [2024-07-12 13:42:46.369190] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:57.936 [2024-07-12 13:42:46.369218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:57.936 
[2024-07-12 13:42:46.369291] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:57.936 [2024-07-12 13:42:46.369318] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:57.936 pt1 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.936 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:58.194 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.194 "name": "raid_bdev1", 00:16:58.194 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:16:58.194 "strip_size_kb": 0, 00:16:58.194 "state": "configuring", 00:16:58.194 "raid_level": "raid1", 00:16:58.194 "superblock": true, 00:16:58.194 "num_base_bdevs": 3, 00:16:58.194 "num_base_bdevs_discovered": 1, 00:16:58.194 "num_base_bdevs_operational": 3, 00:16:58.194 "base_bdevs_list": [ 00:16:58.194 { 00:16:58.194 "name": "pt1", 00:16:58.194 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:58.194 "is_configured": true, 00:16:58.195 "data_offset": 2048, 00:16:58.195 "data_size": 63488 00:16:58.195 }, 00:16:58.195 { 00:16:58.195 "name": null, 00:16:58.195 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:58.195 "is_configured": false, 00:16:58.195 "data_offset": 2048, 00:16:58.195 "data_size": 63488 00:16:58.195 }, 00:16:58.195 { 00:16:58.195 "name": null, 00:16:58.195 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:58.195 "is_configured": false, 00:16:58.195 "data_offset": 2048, 00:16:58.195 "data_size": 63488 00:16:58.195 } 00:16:58.195 ] 00:16:58.195 }' 00:16:58.195 13:42:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.195 13:42:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.783 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:16:58.783 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:59.043 [2024-07-12 13:42:47.482487] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:59.043 
[2024-07-12 13:42:47.482539] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:59.043 [2024-07-12 13:42:47.482562] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x167a8c0 00:16:59.043 [2024-07-12 13:42:47.482580] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:59.043 [2024-07-12 13:42:47.482912] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:59.043 [2024-07-12 13:42:47.482940] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:59.044 [2024-07-12 13:42:47.483003] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:59.044 [2024-07-12 13:42:47.483023] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:59.044 pt2 00:16:59.044 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:59.303 [2024-07-12 13:42:47.731169] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.303 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:59.562 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:59.562 "name": "raid_bdev1", 00:16:59.562 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:16:59.562 "strip_size_kb": 0, 00:16:59.562 "state": "configuring", 00:16:59.562 "raid_level": "raid1", 00:16:59.562 "superblock": true, 00:16:59.562 "num_base_bdevs": 3, 00:16:59.562 "num_base_bdevs_discovered": 1, 00:16:59.562 "num_base_bdevs_operational": 3, 00:16:59.562 "base_bdevs_list": [ 00:16:59.562 { 00:16:59.562 "name": "pt1", 00:16:59.562 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:59.562 "is_configured": true, 00:16:59.562 "data_offset": 2048, 00:16:59.562 "data_size": 63488 00:16:59.562 }, 00:16:59.562 { 00:16:59.562 "name": null, 00:16:59.562 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:59.562 "is_configured": false, 00:16:59.562 "data_offset": 2048, 00:16:59.562 "data_size": 63488 00:16:59.562 }, 00:16:59.562 { 00:16:59.562 
"name": null, 00:16:59.562 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:59.562 "is_configured": false, 00:16:59.562 "data_offset": 2048, 00:16:59.562 "data_size": 63488 00:16:59.563 } 00:16:59.563 ] 00:16:59.563 }' 00:16:59.563 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:59.563 13:42:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.132 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:00.132 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:00.132 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:00.393 [2024-07-12 13:42:48.749860] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:00.393 [2024-07-12 13:42:48.749910] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:00.393 [2024-07-12 13:42:48.749939] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1677270 00:17:00.393 [2024-07-12 13:42:48.749957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:00.393 [2024-07-12 13:42:48.750292] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:00.393 [2024-07-12 13:42:48.750314] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:00.393 [2024-07-12 13:42:48.750377] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:00.393 [2024-07-12 13:42:48.750396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:00.393 pt2 00:17:00.393 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:00.393 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:00.393 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:00.653 [2024-07-12 13:42:48.998515] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:00.653 [2024-07-12 13:42:48.998556] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:00.653 [2024-07-12 13:42:48.998573] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1676bb0 00:17:00.653 [2024-07-12 13:42:48.998585] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:00.653 [2024-07-12 13:42:48.998898] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:00.653 [2024-07-12 13:42:48.998915] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:00.653 [2024-07-12 13:42:48.998981] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:00.653 [2024-07-12 13:42:48.999000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:00.653 [2024-07-12 13:42:48.999109] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x167a480 00:17:00.653 [2024-07-12 13:42:48.999119] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:00.653 [2024-07-12 
13:42:48.999287] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x171d950 00:17:00.653 [2024-07-12 13:42:48.999416] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x167a480 00:17:00.653 [2024-07-12 13:42:48.999426] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x167a480 00:17:00.653 [2024-07-12 13:42:48.999522] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:00.653 pt3 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.653 "name": "raid_bdev1", 00:17:00.653 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:17:00.653 "strip_size_kb": 0, 00:17:00.653 "state": "online", 00:17:00.653 "raid_level": "raid1", 00:17:00.653 "superblock": true, 00:17:00.653 "num_base_bdevs": 3, 00:17:00.653 "num_base_bdevs_discovered": 3, 00:17:00.653 "num_base_bdevs_operational": 3, 00:17:00.653 "base_bdevs_list": [ 00:17:00.653 { 00:17:00.653 "name": "pt1", 00:17:00.653 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:00.653 "is_configured": true, 00:17:00.653 "data_offset": 2048, 00:17:00.653 "data_size": 63488 00:17:00.653 }, 00:17:00.653 { 00:17:00.653 "name": "pt2", 00:17:00.653 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:00.653 "is_configured": true, 00:17:00.653 "data_offset": 2048, 00:17:00.653 "data_size": 63488 00:17:00.653 }, 00:17:00.653 { 00:17:00.653 "name": "pt3", 00:17:00.653 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:00.653 "is_configured": true, 00:17:00.653 "data_offset": 2048, 00:17:00.653 "data_size": 63488 00:17:00.653 } 00:17:00.653 ] 00:17:00.653 }' 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.653 13:42:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.236 13:42:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:01.236 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:01.236 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:01.236 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:01.237 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:01.237 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:01.237 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:01.237 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:01.496 [2024-07-12 13:42:50.025525] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:01.496 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:01.496 "name": "raid_bdev1", 00:17:01.496 "aliases": [ 00:17:01.496 "e4ebb042-1bf8-4a49-89d7-e53cb875e37b" 00:17:01.496 ], 00:17:01.496 "product_name": "Raid Volume", 00:17:01.496 "block_size": 512, 00:17:01.496 "num_blocks": 63488, 00:17:01.496 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:17:01.496 "assigned_rate_limits": { 00:17:01.496 "rw_ios_per_sec": 0, 00:17:01.496 "rw_mbytes_per_sec": 0, 00:17:01.496 "r_mbytes_per_sec": 0, 00:17:01.496 "w_mbytes_per_sec": 0 00:17:01.496 }, 00:17:01.496 "claimed": false, 00:17:01.496 "zoned": false, 00:17:01.496 "supported_io_types": { 00:17:01.496 "read": true, 00:17:01.496 "write": true, 00:17:01.496 "unmap": false, 00:17:01.496 "flush": false, 00:17:01.496 "reset": true, 00:17:01.496 "nvme_admin": false, 00:17:01.496 "nvme_io": false, 00:17:01.496 "nvme_io_md": false, 00:17:01.496 "write_zeroes": true, 00:17:01.496 "zcopy": false, 00:17:01.496 "get_zone_info": false, 00:17:01.496 "zone_management": false, 00:17:01.496 "zone_append": false, 00:17:01.496 "compare": false, 00:17:01.496 "compare_and_write": false, 00:17:01.496 "abort": false, 00:17:01.496 "seek_hole": false, 00:17:01.496 "seek_data": false, 00:17:01.496 "copy": false, 00:17:01.496 "nvme_iov_md": false 00:17:01.496 }, 00:17:01.496 "memory_domains": [ 00:17:01.496 { 00:17:01.496 "dma_device_id": "system", 00:17:01.496 "dma_device_type": 1 00:17:01.496 }, 00:17:01.496 { 00:17:01.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.496 "dma_device_type": 2 00:17:01.496 }, 00:17:01.496 { 00:17:01.496 "dma_device_id": "system", 00:17:01.496 "dma_device_type": 1 00:17:01.496 }, 00:17:01.496 { 00:17:01.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.496 "dma_device_type": 2 00:17:01.496 }, 00:17:01.496 { 00:17:01.496 "dma_device_id": "system", 00:17:01.496 "dma_device_type": 1 00:17:01.496 }, 00:17:01.496 { 00:17:01.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.496 "dma_device_type": 2 00:17:01.496 } 00:17:01.496 ], 00:17:01.496 "driver_specific": { 00:17:01.496 "raid": { 00:17:01.496 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:17:01.496 "strip_size_kb": 0, 00:17:01.496 "state": "online", 00:17:01.496 "raid_level": "raid1", 00:17:01.496 "superblock": true, 00:17:01.496 "num_base_bdevs": 3, 00:17:01.496 "num_base_bdevs_discovered": 3, 00:17:01.496 "num_base_bdevs_operational": 3, 00:17:01.496 "base_bdevs_list": [ 00:17:01.496 { 00:17:01.496 
"name": "pt1", 00:17:01.496 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:01.496 "is_configured": true, 00:17:01.496 "data_offset": 2048, 00:17:01.496 "data_size": 63488 00:17:01.496 }, 00:17:01.496 { 00:17:01.496 "name": "pt2", 00:17:01.496 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:01.496 "is_configured": true, 00:17:01.496 "data_offset": 2048, 00:17:01.496 "data_size": 63488 00:17:01.496 }, 00:17:01.496 { 00:17:01.496 "name": "pt3", 00:17:01.496 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:01.496 "is_configured": true, 00:17:01.496 "data_offset": 2048, 00:17:01.496 "data_size": 63488 00:17:01.496 } 00:17:01.496 ] 00:17:01.496 } 00:17:01.496 } 00:17:01.496 }' 00:17:01.496 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:01.755 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:01.755 pt2 00:17:01.755 pt3' 00:17:01.755 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:01.755 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:01.755 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.014 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.014 "name": "pt1", 00:17:02.014 "aliases": [ 00:17:02.014 "00000000-0000-0000-0000-000000000001" 00:17:02.014 ], 00:17:02.014 "product_name": "passthru", 00:17:02.014 "block_size": 512, 00:17:02.014 "num_blocks": 65536, 00:17:02.014 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:02.014 "assigned_rate_limits": { 00:17:02.014 "rw_ios_per_sec": 0, 00:17:02.014 "rw_mbytes_per_sec": 0, 00:17:02.014 "r_mbytes_per_sec": 0, 00:17:02.014 "w_mbytes_per_sec": 0 00:17:02.014 }, 00:17:02.014 "claimed": true, 00:17:02.014 "claim_type": "exclusive_write", 00:17:02.014 "zoned": false, 00:17:02.014 "supported_io_types": { 00:17:02.014 "read": true, 00:17:02.014 "write": true, 00:17:02.014 "unmap": true, 00:17:02.014 "flush": true, 00:17:02.014 "reset": true, 00:17:02.014 "nvme_admin": false, 00:17:02.014 "nvme_io": false, 00:17:02.014 "nvme_io_md": false, 00:17:02.014 "write_zeroes": true, 00:17:02.014 "zcopy": true, 00:17:02.014 "get_zone_info": false, 00:17:02.014 "zone_management": false, 00:17:02.014 "zone_append": false, 00:17:02.014 "compare": false, 00:17:02.014 "compare_and_write": false, 00:17:02.014 "abort": true, 00:17:02.014 "seek_hole": false, 00:17:02.014 "seek_data": false, 00:17:02.014 "copy": true, 00:17:02.014 "nvme_iov_md": false 00:17:02.014 }, 00:17:02.014 "memory_domains": [ 00:17:02.014 { 00:17:02.014 "dma_device_id": "system", 00:17:02.014 "dma_device_type": 1 00:17:02.014 }, 00:17:02.014 { 00:17:02.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.014 "dma_device_type": 2 00:17:02.014 } 00:17:02.014 ], 00:17:02.014 "driver_specific": { 00:17:02.014 "passthru": { 00:17:02.014 "name": "pt1", 00:17:02.014 "base_bdev_name": "malloc1" 00:17:02.014 } 00:17:02.014 } 00:17:02.014 }' 00:17:02.014 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.014 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.014 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.014 
13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.014 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.014 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.014 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.273 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.273 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.273 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.273 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.273 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:02.273 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:02.273 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.273 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:02.532 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.532 "name": "pt2", 00:17:02.532 "aliases": [ 00:17:02.532 "00000000-0000-0000-0000-000000000002" 00:17:02.532 ], 00:17:02.532 "product_name": "passthru", 00:17:02.532 "block_size": 512, 00:17:02.532 "num_blocks": 65536, 00:17:02.532 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:02.532 "assigned_rate_limits": { 00:17:02.532 "rw_ios_per_sec": 0, 00:17:02.532 "rw_mbytes_per_sec": 0, 00:17:02.532 "r_mbytes_per_sec": 0, 00:17:02.532 "w_mbytes_per_sec": 0 00:17:02.532 }, 00:17:02.532 "claimed": true, 00:17:02.532 "claim_type": "exclusive_write", 00:17:02.532 "zoned": false, 00:17:02.532 "supported_io_types": { 00:17:02.532 "read": true, 00:17:02.532 "write": true, 00:17:02.532 "unmap": true, 00:17:02.532 "flush": true, 00:17:02.532 "reset": true, 00:17:02.532 "nvme_admin": false, 00:17:02.532 "nvme_io": false, 00:17:02.532 "nvme_io_md": false, 00:17:02.532 "write_zeroes": true, 00:17:02.532 "zcopy": true, 00:17:02.532 "get_zone_info": false, 00:17:02.532 "zone_management": false, 00:17:02.532 "zone_append": false, 00:17:02.532 "compare": false, 00:17:02.532 "compare_and_write": false, 00:17:02.532 "abort": true, 00:17:02.532 "seek_hole": false, 00:17:02.532 "seek_data": false, 00:17:02.532 "copy": true, 00:17:02.532 "nvme_iov_md": false 00:17:02.532 }, 00:17:02.532 "memory_domains": [ 00:17:02.532 { 00:17:02.532 "dma_device_id": "system", 00:17:02.532 "dma_device_type": 1 00:17:02.532 }, 00:17:02.532 { 00:17:02.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.532 "dma_device_type": 2 00:17:02.532 } 00:17:02.532 ], 00:17:02.532 "driver_specific": { 00:17:02.532 "passthru": { 00:17:02.532 "name": "pt2", 00:17:02.532 "base_bdev_name": "malloc2" 00:17:02.532 } 00:17:02.532 } 00:17:02.532 }' 00:17:02.532 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.532 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.532 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.532 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.791 13:42:51 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.791 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.791 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.791 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.791 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.791 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.791 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.050 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.050 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.050 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:03.050 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.308 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.308 "name": "pt3", 00:17:03.308 "aliases": [ 00:17:03.308 "00000000-0000-0000-0000-000000000003" 00:17:03.308 ], 00:17:03.308 "product_name": "passthru", 00:17:03.308 "block_size": 512, 00:17:03.308 "num_blocks": 65536, 00:17:03.308 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:03.308 "assigned_rate_limits": { 00:17:03.308 "rw_ios_per_sec": 0, 00:17:03.308 "rw_mbytes_per_sec": 0, 00:17:03.308 "r_mbytes_per_sec": 0, 00:17:03.308 "w_mbytes_per_sec": 0 00:17:03.308 }, 00:17:03.308 "claimed": true, 00:17:03.308 "claim_type": "exclusive_write", 00:17:03.308 "zoned": false, 00:17:03.308 "supported_io_types": { 00:17:03.308 "read": true, 00:17:03.308 "write": true, 00:17:03.308 "unmap": true, 00:17:03.308 "flush": true, 00:17:03.308 "reset": true, 00:17:03.308 "nvme_admin": false, 00:17:03.309 "nvme_io": false, 00:17:03.309 "nvme_io_md": false, 00:17:03.309 "write_zeroes": true, 00:17:03.309 "zcopy": true, 00:17:03.309 "get_zone_info": false, 00:17:03.309 "zone_management": false, 00:17:03.309 "zone_append": false, 00:17:03.309 "compare": false, 00:17:03.309 "compare_and_write": false, 00:17:03.309 "abort": true, 00:17:03.309 "seek_hole": false, 00:17:03.309 "seek_data": false, 00:17:03.309 "copy": true, 00:17:03.309 "nvme_iov_md": false 00:17:03.309 }, 00:17:03.309 "memory_domains": [ 00:17:03.309 { 00:17:03.309 "dma_device_id": "system", 00:17:03.309 "dma_device_type": 1 00:17:03.309 }, 00:17:03.309 { 00:17:03.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.309 "dma_device_type": 2 00:17:03.309 } 00:17:03.309 ], 00:17:03.309 "driver_specific": { 00:17:03.309 "passthru": { 00:17:03.309 "name": "pt3", 00:17:03.309 "base_bdev_name": "malloc3" 00:17:03.309 } 00:17:03.309 } 00:17:03.309 }' 00:17:03.309 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.309 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.309 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:03.309 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.309 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.568 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == 
null ]] 00:17:03.568 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.568 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.568 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:03.568 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.568 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.568 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.568 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:03.568 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:03.827 [2024-07-12 13:42:52.311580] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:03.827 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' e4ebb042-1bf8-4a49-89d7-e53cb875e37b '!=' e4ebb042-1bf8-4a49-89d7-e53cb875e37b ']' 00:17:03.827 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:03.827 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:03.827 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:03.827 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:04.396 [2024-07-12 13:42:52.812688] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.396 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:04.963 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.963 "name": "raid_bdev1", 00:17:04.963 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:17:04.963 "strip_size_kb": 0, 00:17:04.963 "state": "online", 00:17:04.963 "raid_level": "raid1", 00:17:04.963 "superblock": true, 
00:17:04.963 "num_base_bdevs": 3, 00:17:04.963 "num_base_bdevs_discovered": 2, 00:17:04.964 "num_base_bdevs_operational": 2, 00:17:04.964 "base_bdevs_list": [ 00:17:04.964 { 00:17:04.964 "name": null, 00:17:04.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.964 "is_configured": false, 00:17:04.964 "data_offset": 2048, 00:17:04.964 "data_size": 63488 00:17:04.964 }, 00:17:04.964 { 00:17:04.964 "name": "pt2", 00:17:04.964 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:04.964 "is_configured": true, 00:17:04.964 "data_offset": 2048, 00:17:04.964 "data_size": 63488 00:17:04.964 }, 00:17:04.964 { 00:17:04.964 "name": "pt3", 00:17:04.964 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:04.964 "is_configured": true, 00:17:04.964 "data_offset": 2048, 00:17:04.964 "data_size": 63488 00:17:04.964 } 00:17:04.964 ] 00:17:04.964 }' 00:17:04.964 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.964 13:42:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.532 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:05.791 [2024-07-12 13:42:54.208348] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:05.791 [2024-07-12 13:42:54.208375] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:05.791 [2024-07-12 13:42:54.208424] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:05.791 [2024-07-12 13:42:54.208487] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:05.791 [2024-07-12 13:42:54.208499] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x167a480 name raid_bdev1, state offline 00:17:05.791 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.791 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:17:06.050 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:17:06.050 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:17:06.050 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:17:06.050 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:06.050 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:06.309 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:06.309 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:06.309 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:06.568 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:06.568 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:06.568 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:17:06.568 13:42:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:06.568 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:06.826 [2024-07-12 13:42:55.210974] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:06.826 [2024-07-12 13:42:55.211017] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.826 [2024-07-12 13:42:55.211035] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1676940 00:17:06.826 [2024-07-12 13:42:55.211047] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.826 [2024-07-12 13:42:55.212640] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.826 [2024-07-12 13:42:55.212668] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:06.827 [2024-07-12 13:42:55.212732] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:06.827 [2024-07-12 13:42:55.212757] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:06.827 pt2 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.827 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:07.085 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:07.085 "name": "raid_bdev1", 00:17:07.085 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:17:07.085 "strip_size_kb": 0, 00:17:07.085 "state": "configuring", 00:17:07.085 "raid_level": "raid1", 00:17:07.085 "superblock": true, 00:17:07.085 "num_base_bdevs": 3, 00:17:07.085 "num_base_bdevs_discovered": 1, 00:17:07.085 "num_base_bdevs_operational": 2, 00:17:07.085 "base_bdevs_list": [ 00:17:07.085 { 00:17:07.085 "name": null, 00:17:07.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.085 "is_configured": false, 00:17:07.085 "data_offset": 2048, 00:17:07.085 "data_size": 63488 00:17:07.085 }, 00:17:07.085 { 00:17:07.085 "name": "pt2", 00:17:07.086 "uuid": "00000000-0000-0000-0000-000000000002", 
00:17:07.086 "is_configured": true, 00:17:07.086 "data_offset": 2048, 00:17:07.086 "data_size": 63488 00:17:07.086 }, 00:17:07.086 { 00:17:07.086 "name": null, 00:17:07.086 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:07.086 "is_configured": false, 00:17:07.086 "data_offset": 2048, 00:17:07.086 "data_size": 63488 00:17:07.086 } 00:17:07.086 ] 00:17:07.086 }' 00:17:07.086 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:07.086 13:42:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.654 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:07.654 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:07.654 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:17:07.654 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:07.913 [2024-07-12 13:42:56.313892] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:07.913 [2024-07-12 13:42:56.313947] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:07.913 [2024-07-12 13:42:56.313966] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1716450 00:17:07.913 [2024-07-12 13:42:56.313979] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:07.913 [2024-07-12 13:42:56.314315] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:07.913 [2024-07-12 13:42:56.314332] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:07.913 [2024-07-12 13:42:56.314393] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:07.913 [2024-07-12 13:42:56.314413] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:07.913 [2024-07-12 13:42:56.314513] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x171b0f0 00:17:07.913 [2024-07-12 13:42:56.314525] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:07.913 [2024-07-12 13:42:56.314691] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x171d010 00:17:07.913 [2024-07-12 13:42:56.314815] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x171b0f0 00:17:07.913 [2024-07-12 13:42:56.314825] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x171b0f0 00:17:07.913 [2024-07-12 13:42:56.314921] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:07.913 pt3 00:17:07.913 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:07.913 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:07.913 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:07.913 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:07.913 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:07.913 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:07.913 13:42:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.913 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.913 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:07.913 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.913 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.913 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:08.173 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.173 "name": "raid_bdev1", 00:17:08.173 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:17:08.173 "strip_size_kb": 0, 00:17:08.173 "state": "online", 00:17:08.173 "raid_level": "raid1", 00:17:08.173 "superblock": true, 00:17:08.173 "num_base_bdevs": 3, 00:17:08.173 "num_base_bdevs_discovered": 2, 00:17:08.173 "num_base_bdevs_operational": 2, 00:17:08.173 "base_bdevs_list": [ 00:17:08.173 { 00:17:08.173 "name": null, 00:17:08.173 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.173 "is_configured": false, 00:17:08.173 "data_offset": 2048, 00:17:08.173 "data_size": 63488 00:17:08.173 }, 00:17:08.173 { 00:17:08.173 "name": "pt2", 00:17:08.173 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:08.173 "is_configured": true, 00:17:08.173 "data_offset": 2048, 00:17:08.173 "data_size": 63488 00:17:08.173 }, 00:17:08.173 { 00:17:08.173 "name": "pt3", 00:17:08.173 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:08.173 "is_configured": true, 00:17:08.173 "data_offset": 2048, 00:17:08.173 "data_size": 63488 00:17:08.173 } 00:17:08.173 ] 00:17:08.173 }' 00:17:08.173 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.173 13:42:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:08.741 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:09.000 [2024-07-12 13:42:57.340585] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:09.000 [2024-07-12 13:42:57.340607] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:09.000 [2024-07-12 13:42:57.340656] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:09.000 [2024-07-12 13:42:57.340708] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:09.000 [2024-07-12 13:42:57.340719] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x171b0f0 name raid_bdev1, state offline 00:17:09.000 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.000 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:17:09.259 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:17:09.259 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:17:09.259 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 
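The superblock portion of this test keeps tearing the array down and rebuilding individual passthru bdevs so that bdev_raid's examine path has to reassemble raid_bdev1 from its on-disk superblock alone. A condensed, illustrative sketch of one such delete/recreate cycle, reusing only the RPC socket, script path, and bdev/UUID names that appear in this trace:

    # hedged sketch of the superblock reassembly cycle exercised around bdev_raid.sh@525-@539
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_raid_delete raid_bdev1          # drop the assembled array (state goes online -> offline)
    $RPC bdev_passthru_delete pt3             # remove one base bdev entirely
    # recreating a base bdev lets examine find the raid superblock again and
    # re-register raid_bdev1 in "configuring" state with the surviving members
    $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'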
00:17:09.259 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:17:09.259 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:09.519 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:09.519 [2024-07-12 13:42:58.094552] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:09.519 [2024-07-12 13:42:58.094595] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:09.519 [2024-07-12 13:42:58.094611] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1678ab0 00:17:09.519 [2024-07-12 13:42:58.094624] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:09.519 [2024-07-12 13:42:58.096204] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:09.519 [2024-07-12 13:42:58.096231] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:09.519 [2024-07-12 13:42:58.096292] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:09.519 [2024-07-12 13:42:58.096317] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:09.519 [2024-07-12 13:42:58.096417] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:09.519 [2024-07-12 13:42:58.096430] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:09.519 [2024-07-12 13:42:58.096444] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x171cb80 name raid_bdev1, state configuring 00:17:09.519 [2024-07-12 13:42:58.096467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:09.519 pt1 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.778 13:42:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:10.037 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:10.037 "name": "raid_bdev1", 00:17:10.037 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:17:10.037 "strip_size_kb": 0, 00:17:10.037 "state": "configuring", 00:17:10.037 "raid_level": "raid1", 00:17:10.037 "superblock": true, 00:17:10.037 "num_base_bdevs": 3, 00:17:10.037 "num_base_bdevs_discovered": 1, 00:17:10.037 "num_base_bdevs_operational": 2, 00:17:10.037 "base_bdevs_list": [ 00:17:10.037 { 00:17:10.037 "name": null, 00:17:10.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:10.037 "is_configured": false, 00:17:10.037 "data_offset": 2048, 00:17:10.037 "data_size": 63488 00:17:10.037 }, 00:17:10.037 { 00:17:10.037 "name": "pt2", 00:17:10.037 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:10.037 "is_configured": true, 00:17:10.037 "data_offset": 2048, 00:17:10.037 "data_size": 63488 00:17:10.037 }, 00:17:10.037 { 00:17:10.037 "name": null, 00:17:10.037 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:10.037 "is_configured": false, 00:17:10.037 "data_offset": 2048, 00:17:10.037 "data_size": 63488 00:17:10.037 } 00:17:10.037 ] 00:17:10.037 }' 00:17:10.037 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:10.037 13:42:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.975 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:10.975 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:10.975 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:17:10.975 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:11.234 [2024-07-12 13:42:59.742929] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:11.234 [2024-07-12 13:42:59.742980] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:11.234 [2024-07-12 13:42:59.742999] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x167a160 00:17:11.234 [2024-07-12 13:42:59.743012] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:11.234 [2024-07-12 13:42:59.743359] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:11.234 [2024-07-12 13:42:59.743382] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:11.234 [2024-07-12 13:42:59.743445] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:11.234 [2024-07-12 13:42:59.743465] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:11.234 [2024-07-12 13:42:59.743562] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17161e0 00:17:11.234 [2024-07-12 13:42:59.743573] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:11.234 [2024-07-12 13:42:59.743735] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1679190 00:17:11.234 [2024-07-12 13:42:59.743857] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17161e0 00:17:11.234 [2024-07-12 13:42:59.743867] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17161e0 00:17:11.234 [2024-07-12 13:42:59.743971] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:11.234 pt3 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.234 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:11.803 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.803 "name": "raid_bdev1", 00:17:11.803 "uuid": "e4ebb042-1bf8-4a49-89d7-e53cb875e37b", 00:17:11.803 "strip_size_kb": 0, 00:17:11.803 "state": "online", 00:17:11.803 "raid_level": "raid1", 00:17:11.803 "superblock": true, 00:17:11.803 "num_base_bdevs": 3, 00:17:11.803 "num_base_bdevs_discovered": 2, 00:17:11.803 "num_base_bdevs_operational": 2, 00:17:11.803 "base_bdevs_list": [ 00:17:11.803 { 00:17:11.803 "name": null, 00:17:11.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:11.803 "is_configured": false, 00:17:11.803 "data_offset": 2048, 00:17:11.803 "data_size": 63488 00:17:11.803 }, 00:17:11.803 { 00:17:11.803 "name": "pt2", 00:17:11.803 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:11.803 "is_configured": true, 00:17:11.803 "data_offset": 2048, 00:17:11.803 "data_size": 63488 00:17:11.803 }, 00:17:11.803 { 00:17:11.803 "name": "pt3", 00:17:11.803 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:11.803 "is_configured": true, 00:17:11.803 "data_offset": 2048, 00:17:11.803 "data_size": 63488 00:17:11.803 } 00:17:11.803 ] 00:17:11.803 }' 00:17:11.803 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.803 13:43:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:12.741 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:12.741 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:13.000 13:43:01 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:17:13.000 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:13.000 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:17:13.569 [2024-07-12 13:43:01.912970] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' e4ebb042-1bf8-4a49-89d7-e53cb875e37b '!=' e4ebb042-1bf8-4a49-89d7-e53cb875e37b ']' 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 483693 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 483693 ']' 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 483693 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 483693 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 483693' 00:17:13.569 killing process with pid 483693 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 483693 00:17:13.569 [2024-07-12 13:43:01.995067] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:13.569 [2024-07-12 13:43:01.995118] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:13.569 [2024-07-12 13:43:01.995173] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:13.569 [2024-07-12 13:43:01.995186] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17161e0 name raid_bdev1, state offline 00:17:13.569 13:43:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 483693 00:17:13.569 [2024-07-12 13:43:02.025461] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:13.829 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:13.829 00:17:13.829 real 0m23.860s 00:17:13.829 user 0m43.774s 00:17:13.829 sys 0m4.112s 00:17:13.829 13:43:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:13.829 13:43:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:13.829 ************************************ 00:17:13.829 END TEST raid_superblock_test 00:17:13.829 ************************************ 00:17:13.829 13:43:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:13.829 13:43:02 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:17:13.829 13:43:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:13.829 13:43:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:13.829 13:43:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:13.829 
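Both the superblock test above and the error-injection tests that follow rely on the same verification pattern (bdev_raid.sh@116-@128): dump the raid bdevs over RPC, select the bdev under test with jq, and compare its fields against the expected state. A minimal standalone sketch of that check, assuming the same socket and the field names shown in the JSON dumps in this trace:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    info=$($RPC bdev_raid_get_bdevs all | jq '.[] | select(.name == "raid_bdev1")')
    state=$(echo "$info" | jq -r '.state')                            # e.g. configuring or online
    level=$(echo "$info" | jq -r '.raid_level')                       # raid1 throughout these tests
    discovered=$(echo "$info" | jq -r '.num_base_bdevs_discovered')
    operational=$(echo "$info" | jq -r '.num_base_bdevs_operational')
    [ "$state" = online ] && [ "$level" = raid1 ] && [ "$discovered" -eq "$operational" ] \
        || echo "raid_bdev1 not in the expected state" >&2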
************************************ 00:17:13.829 START TEST raid_read_error_test 00:17:13.829 ************************************ 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Ynhqbl923H 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=487292 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 487292 /var/tmp/spdk-raid.sock 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 487292 ']' 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:13.829 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:13.829 13:43:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:14.088 [2024-07-12 13:43:02.465132] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:17:14.088 [2024-07-12 13:43:02.465268] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid487292 ] 00:17:14.088 [2024-07-12 13:43:02.659741] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:14.346 [2024-07-12 13:43:02.757716] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:14.346 [2024-07-12 13:43:02.818328] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:14.346 [2024-07-12 13:43:02.818357] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:14.916 13:43:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:14.916 13:43:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:14.916 13:43:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:14.916 13:43:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:15.176 BaseBdev1_malloc 00:17:15.176 13:43:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:15.434 true 00:17:15.434 13:43:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:16.000 [2024-07-12 13:43:04.477728] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:16.000 [2024-07-12 13:43:04.477778] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:16.000 [2024-07-12 13:43:04.477800] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcdfa10 00:17:16.000 [2024-07-12 13:43:04.477813] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:16.000 [2024-07-12 13:43:04.479744] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:16.000 [2024-07-12 13:43:04.479774] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:16.000 BaseBdev1 00:17:16.000 13:43:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:16.000 13:43:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:16.258 BaseBdev2_malloc 00:17:16.258 13:43:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:16.825 true 00:17:16.825 13:43:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:17.082 [2024-07-12 13:43:05.594489] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:17.082 [2024-07-12 13:43:05.594535] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:17.082 [2024-07-12 13:43:05.594556] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xce4250 00:17:17.082 [2024-07-12 13:43:05.594568] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:17.082 [2024-07-12 13:43:05.596182] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:17.082 [2024-07-12 13:43:05.596209] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:17.082 BaseBdev2 00:17:17.082 13:43:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:17.082 13:43:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:17.650 BaseBdev3_malloc 00:17:17.650 13:43:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:17.909 true 00:17:17.909 13:43:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:18.477 [2024-07-12 13:43:06.911755] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:18.477 [2024-07-12 13:43:06.911799] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:18.477 [2024-07-12 13:43:06.911819] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xce6510 00:17:18.477 [2024-07-12 13:43:06.911832] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:18.477 [2024-07-12 13:43:06.913434] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:18.477 [2024-07-12 13:43:06.913463] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:18.477 BaseBdev3 00:17:18.477 13:43:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:18.736 [2024-07-12 13:43:07.212570] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:18.736 [2024-07-12 13:43:07.213909] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:18.736 [2024-07-12 13:43:07.213987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:18.736 [2024-07-12 
13:43:07.214199] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xce7bc0 00:17:18.736 [2024-07-12 13:43:07.214212] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:18.736 [2024-07-12 13:43:07.214412] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xce7760 00:17:18.736 [2024-07-12 13:43:07.214562] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xce7bc0 00:17:18.736 [2024-07-12 13:43:07.214572] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xce7bc0 00:17:18.736 [2024-07-12 13:43:07.214676] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.736 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:19.305 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:19.305 "name": "raid_bdev1", 00:17:19.305 "uuid": "50b56af5-c9e2-42e7-b812-8ffa3de81ed4", 00:17:19.305 "strip_size_kb": 0, 00:17:19.305 "state": "online", 00:17:19.305 "raid_level": "raid1", 00:17:19.305 "superblock": true, 00:17:19.305 "num_base_bdevs": 3, 00:17:19.305 "num_base_bdevs_discovered": 3, 00:17:19.305 "num_base_bdevs_operational": 3, 00:17:19.305 "base_bdevs_list": [ 00:17:19.305 { 00:17:19.305 "name": "BaseBdev1", 00:17:19.305 "uuid": "73d77701-f2e2-5e6a-af2a-8fff797cffe5", 00:17:19.305 "is_configured": true, 00:17:19.305 "data_offset": 2048, 00:17:19.305 "data_size": 63488 00:17:19.305 }, 00:17:19.305 { 00:17:19.305 "name": "BaseBdev2", 00:17:19.305 "uuid": "d44cc439-0c7e-56b2-a8ce-86321200327e", 00:17:19.305 "is_configured": true, 00:17:19.305 "data_offset": 2048, 00:17:19.305 "data_size": 63488 00:17:19.305 }, 00:17:19.305 { 00:17:19.305 "name": "BaseBdev3", 00:17:19.305 "uuid": "0bfe3ba3-7397-57f3-81a4-511a10e55743", 00:17:19.305 "is_configured": true, 00:17:19.305 "data_offset": 2048, 00:17:19.305 "data_size": 63488 00:17:19.305 } 00:17:19.305 ] 00:17:19.305 }' 00:17:19.305 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:19.305 13:43:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 
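With the raid1 array of BaseBdev1-3 online, the read-error test drives I/O through the bdevperf instance started earlier with -z/-f, injects a read failure into the error bdev under BaseBdev1, and then checks bdevperf's per-bdev failure column; because raid1 is redundant, a single failed base-bdev read path should be masked and the expected fail-per-second value is 0.00 (see bdev_raid.sh@843-@845 further down). A condensed sketch of that sequence, assuming the bdevperf log path from this run:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
        -s /var/tmp/spdk-raid.sock perform_tests &                    # kick off the queued bdevperf job
    sleep 1
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure     # fail reads on one base bdev
    wait                                                              # let the 60s randrw workload finish
    fail_per_s=$(grep -v Job /raidtest/tmp.Ynhqbl923H | grep raid_bdev1 | awk '{print $6}')
    [ "$fail_per_s" = 0.00 ] || echo "raid1 did not mask the injected read error" >&2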
00:17:20.243 13:43:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:20.243 13:43:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:20.243 [2024-07-12 13:43:08.768992] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb35740 00:17:21.180 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.439 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:21.698 13:43:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.698 "name": "raid_bdev1", 00:17:21.698 "uuid": "50b56af5-c9e2-42e7-b812-8ffa3de81ed4", 00:17:21.698 "strip_size_kb": 0, 00:17:21.698 "state": "online", 00:17:21.699 "raid_level": "raid1", 00:17:21.699 "superblock": true, 00:17:21.699 "num_base_bdevs": 3, 00:17:21.699 "num_base_bdevs_discovered": 3, 00:17:21.699 "num_base_bdevs_operational": 3, 00:17:21.699 "base_bdevs_list": [ 00:17:21.699 { 00:17:21.699 "name": "BaseBdev1", 00:17:21.699 "uuid": "73d77701-f2e2-5e6a-af2a-8fff797cffe5", 00:17:21.699 "is_configured": true, 00:17:21.699 "data_offset": 2048, 00:17:21.699 "data_size": 63488 00:17:21.699 }, 00:17:21.699 { 00:17:21.699 "name": "BaseBdev2", 00:17:21.699 "uuid": "d44cc439-0c7e-56b2-a8ce-86321200327e", 00:17:21.699 "is_configured": true, 00:17:21.699 "data_offset": 2048, 00:17:21.699 "data_size": 63488 00:17:21.699 }, 00:17:21.699 { 00:17:21.699 "name": "BaseBdev3", 00:17:21.699 "uuid": 
"0bfe3ba3-7397-57f3-81a4-511a10e55743", 00:17:21.699 "is_configured": true, 00:17:21.699 "data_offset": 2048, 00:17:21.699 "data_size": 63488 00:17:21.699 } 00:17:21.699 ] 00:17:21.699 }' 00:17:21.699 13:43:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.699 13:43:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.267 13:43:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:22.525 [2024-07-12 13:43:11.025087] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:22.525 [2024-07-12 13:43:11.025131] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:22.525 [2024-07-12 13:43:11.028322] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:22.525 [2024-07-12 13:43:11.028355] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:22.525 [2024-07-12 13:43:11.028451] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:22.526 [2024-07-12 13:43:11.028463] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xce7bc0 name raid_bdev1, state offline 00:17:22.526 0 00:17:22.526 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 487292 00:17:22.526 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 487292 ']' 00:17:22.526 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 487292 00:17:22.526 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:22.526 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:22.526 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 487292 00:17:22.526 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:22.526 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:22.526 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 487292' 00:17:22.526 killing process with pid 487292 00:17:22.526 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 487292 00:17:22.526 [2024-07-12 13:43:11.106836] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:22.526 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 487292 00:17:22.784 [2024-07-12 13:43:11.128074] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:22.785 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Ynhqbl923H 00:17:22.785 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:22.785 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:22.785 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:22.785 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:22.785 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:22.785 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 
0 00:17:22.785 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:22.785 00:17:22.785 real 0m9.025s 00:17:22.785 user 0m14.851s 00:17:22.785 sys 0m1.508s 00:17:22.785 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:22.785 13:43:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.785 ************************************ 00:17:22.785 END TEST raid_read_error_test 00:17:22.785 ************************************ 00:17:23.058 13:43:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:23.058 13:43:11 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:17:23.058 13:43:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:23.058 13:43:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:23.058 13:43:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:23.058 ************************************ 00:17:23.058 START TEST raid_write_error_test 00:17:23.058 ************************************ 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # 
'[' raid1 '!=' raid1 ']' 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.qD5ifFUgqo 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=488614 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 488614 /var/tmp/spdk-raid.sock 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 488614 ']' 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:23.058 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:23.058 13:43:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:23.058 [2024-07-12 13:43:11.537137] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:17:23.058 [2024-07-12 13:43:11.537207] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid488614 ] 00:17:23.316 [2024-07-12 13:43:11.666603] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:23.316 [2024-07-12 13:43:11.768359] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:23.316 [2024-07-12 13:43:11.834650] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:23.316 [2024-07-12 13:43:11.834697] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:24.246 13:43:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:24.246 13:43:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:24.246 13:43:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:24.246 13:43:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:24.503 BaseBdev1_malloc 00:17:24.503 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:24.806 true 00:17:24.806 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:25.372 [2024-07-12 13:43:13.810231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:25.372 [2024-07-12 13:43:13.810279] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:25.372 [2024-07-12 13:43:13.810305] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x111aa10 00:17:25.372 [2024-07-12 13:43:13.810318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:25.372 [2024-07-12 13:43:13.812247] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:25.372 [2024-07-12 13:43:13.812277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:25.372 BaseBdev1 00:17:25.372 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:25.372 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:25.629 BaseBdev2_malloc 00:17:25.629 13:43:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:25.887 true 00:17:25.887 13:43:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:26.454 [2024-07-12 13:43:14.914969] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:26.454 [2024-07-12 13:43:14.915016] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:26.454 [2024-07-12 13:43:14.915039] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x111f250 00:17:26.454 [2024-07-12 13:43:14.915052] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:26.454 [2024-07-12 13:43:14.916636] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:26.454 [2024-07-12 13:43:14.916664] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:26.454 BaseBdev2 00:17:26.454 13:43:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:26.454 13:43:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:26.713 BaseBdev3_malloc 00:17:26.713 13:43:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:26.973 true 00:17:26.973 13:43:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:27.541 [2024-07-12 13:43:16.015714] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:27.541 [2024-07-12 13:43:16.015759] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:27.542 [2024-07-12 13:43:16.015783] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1121510 00:17:27.542 [2024-07-12 13:43:16.015796] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:27.542 [2024-07-12 13:43:16.017407] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:27.542 [2024-07-12 13:43:16.017435] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:27.542 BaseBdev3 00:17:27.542 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:27.801 [2024-07-12 13:43:16.320535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:27.801 [2024-07-12 13:43:16.321865] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:27.801 [2024-07-12 13:43:16.321941] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:27.801 [2024-07-12 13:43:16.322150] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1122bc0 00:17:27.801 [2024-07-12 13:43:16.322162] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:27.801 [2024-07-12 13:43:16.322356] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1122760 00:17:27.801 [2024-07-12 13:43:16.322507] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1122bc0 00:17:27.801 [2024-07-12 13:43:16.322517] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1122bc0 00:17:27.801 [2024-07-12 13:43:16.322623] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:27.801 
13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:27.801 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:27.801 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:27.801 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:27.801 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:27.801 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:27.801 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.801 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:27.801 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:27.801 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.801 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.801 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:28.368 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.368 "name": "raid_bdev1", 00:17:28.369 "uuid": "f7f66716-376e-4b66-805f-926a00212fde", 00:17:28.369 "strip_size_kb": 0, 00:17:28.369 "state": "online", 00:17:28.369 "raid_level": "raid1", 00:17:28.369 "superblock": true, 00:17:28.369 "num_base_bdevs": 3, 00:17:28.369 "num_base_bdevs_discovered": 3, 00:17:28.369 "num_base_bdevs_operational": 3, 00:17:28.369 "base_bdevs_list": [ 00:17:28.369 { 00:17:28.369 "name": "BaseBdev1", 00:17:28.369 "uuid": "20a01542-4328-5e10-8303-6382cdd6ca9a", 00:17:28.369 "is_configured": true, 00:17:28.369 "data_offset": 2048, 00:17:28.369 "data_size": 63488 00:17:28.369 }, 00:17:28.369 { 00:17:28.369 "name": "BaseBdev2", 00:17:28.369 "uuid": "3db762d2-df7a-59a7-947a-1c380cece44a", 00:17:28.369 "is_configured": true, 00:17:28.369 "data_offset": 2048, 00:17:28.369 "data_size": 63488 00:17:28.369 }, 00:17:28.369 { 00:17:28.369 "name": "BaseBdev3", 00:17:28.369 "uuid": "61f1aaac-4226-5627-a6e0-3bb050a605f5", 00:17:28.369 "is_configured": true, 00:17:28.369 "data_offset": 2048, 00:17:28.369 "data_size": 63488 00:17:28.369 } 00:17:28.369 ] 00:17:28.369 }' 00:17:28.369 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.369 13:43:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:28.936 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:28.936 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:29.195 [2024-07-12 13:43:17.560101] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf70740 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:30.133 [2024-07-12 13:43:18.684219] 
bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:17:30.133 [2024-07-12 13:43:18.684278] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:30.133 [2024-07-12 13:43:18.684473] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xf70740 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.133 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:30.392 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.392 "name": "raid_bdev1", 00:17:30.392 "uuid": "f7f66716-376e-4b66-805f-926a00212fde", 00:17:30.392 "strip_size_kb": 0, 00:17:30.392 "state": "online", 00:17:30.392 "raid_level": "raid1", 00:17:30.392 "superblock": true, 00:17:30.392 "num_base_bdevs": 3, 00:17:30.392 "num_base_bdevs_discovered": 2, 00:17:30.392 "num_base_bdevs_operational": 2, 00:17:30.392 "base_bdevs_list": [ 00:17:30.392 { 00:17:30.392 "name": null, 00:17:30.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:30.392 "is_configured": false, 00:17:30.392 "data_offset": 2048, 00:17:30.392 "data_size": 63488 00:17:30.392 }, 00:17:30.392 { 00:17:30.392 "name": "BaseBdev2", 00:17:30.392 "uuid": "3db762d2-df7a-59a7-947a-1c380cece44a", 00:17:30.392 "is_configured": true, 00:17:30.392 "data_offset": 2048, 00:17:30.392 "data_size": 63488 00:17:30.392 }, 00:17:30.392 { 00:17:30.392 "name": "BaseBdev3", 00:17:30.392 "uuid": "61f1aaac-4226-5627-a6e0-3bb050a605f5", 00:17:30.392 "is_configured": true, 00:17:30.392 "data_offset": 2048, 00:17:30.392 "data_size": 63488 00:17:30.392 } 00:17:30.392 ] 00:17:30.392 }' 00:17:30.392 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.392 
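The degradation step above boils down to two RPCs: inject a write failure into the error bdev underneath BaseBdev1, then re-read the RAID state and confirm that raid1 stays online with one fewer member. A sketch reusing the commands and jq selector from the trace (the RPC shorthand and the compressed jq output line are illustrative additions):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # make every write to EE_BaseBdev1_malloc fail; the raid module reacts by
    # failing slot 0 ('BaseBdev1') of raid_bdev1, as the NOTICE above shows
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
    # re-read the array: raid1 tolerates the loss, so state stays "online" while
    # num_base_bdevs_discovered and num_base_bdevs_operational drop from 3 to 2
    $RPC bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1")
                 | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs_operational)"'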
13:43:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:31.330 13:43:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:31.330 [2024-07-12 13:43:19.844524] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:31.330 [2024-07-12 13:43:19.844561] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:31.330 [2024-07-12 13:43:19.847703] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:31.330 [2024-07-12 13:43:19.847733] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:31.330 [2024-07-12 13:43:19.847806] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:31.330 [2024-07-12 13:43:19.847817] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1122bc0 name raid_bdev1, state offline 00:17:31.330 0 00:17:31.330 13:43:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 488614 00:17:31.330 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 488614 ']' 00:17:31.330 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 488614 00:17:31.330 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:31.330 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:31.330 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 488614 00:17:31.589 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:31.589 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:31.589 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 488614' 00:17:31.589 killing process with pid 488614 00:17:31.589 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 488614 00:17:31.589 [2024-07-12 13:43:19.916281] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:31.589 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 488614 00:17:31.589 [2024-07-12 13:43:19.940614] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:31.849 13:43:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.qD5ifFUgqo 00:17:31.849 13:43:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:31.849 13:43:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:31.849 13:43:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:31.849 13:43:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:31.849 13:43:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:31.849 13:43:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:31.849 13:43:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:31.849 00:17:31.849 real 0m8.729s 00:17:31.849 user 0m14.320s 00:17:31.849 sys 0m1.426s 00:17:31.849 13:43:20 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:17:31.849 13:43:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:31.849 ************************************ 00:17:31.849 END TEST raid_write_error_test 00:17:31.849 ************************************ 00:17:31.849 13:43:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:31.849 13:43:20 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:17:31.849 13:43:20 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:31.849 13:43:20 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:17:31.849 13:43:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:31.849 13:43:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:31.849 13:43:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:31.849 ************************************ 00:17:31.849 START TEST raid_state_function_test 00:17:31.849 ************************************ 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=489775 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 489775' 00:17:31.849 Process raid pid: 489775 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 489775 /var/tmp/spdk-raid.sock 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 489775 ']' 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:31.849 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:31.849 13:43:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:31.849 [2024-07-12 13:43:20.338205] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
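For the state-function test no bdevs exist up front; the trace above first launches a bare bdev_svc application as the RPC target (with bdev_raid debug logging) and waits for its socket before issuing any RAID RPCs. A minimal sketch of that bring-up, reusing the binary path, socket and flags from the log (capturing the PID with $! and the waitforlisten helper call paraphrase the test scripts rather than quote them):

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    # standalone bdev service, its own RPC socket, instance id 0, bdev_raid debug log
    $SPDK/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    raid_pid=$!
    echo "Process raid pid: $raid_pid"
    # block until the app is up and listening on /var/tmp/spdk-raid.sock
    waitforlisten $raid_pid /var/tmp/spdk-raid.sock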
00:17:31.849 [2024-07-12 13:43:20.338258] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:32.109 [2024-07-12 13:43:20.449976] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:32.109 [2024-07-12 13:43:20.551575] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:32.109 [2024-07-12 13:43:20.612516] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:32.109 [2024-07-12 13:43:20.612553] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:32.760 13:43:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:32.760 13:43:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:32.760 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:33.054 [2024-07-12 13:43:21.433194] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:33.054 [2024-07-12 13:43:21.433239] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:33.054 [2024-07-12 13:43:21.433250] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:33.054 [2024-07-12 13:43:21.433261] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:33.054 [2024-07-12 13:43:21.433270] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:33.054 [2024-07-12 13:43:21.433282] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:33.054 [2024-07-12 13:43:21.433290] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:33.054 [2024-07-12 13:43:21.433301] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:33.054 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:33.054 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:33.054 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.054 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:33.054 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:33.054 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:33.054 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.054 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.054 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.054 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.054 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:33.054 
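The first bdev_raid_create above is issued while none of BaseBdev1..4 exist, so the array is registered but cannot assemble: it has to sit in the "configuring" state with all four members unconfigured. The verification is a jq projection over bdev_raid_get_bdevs; a condensed sketch (create command and jq filter taken from the trace, the explicit comparisons are an illustrative paraphrase of verify_raid_bdev_state):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # raid0 with 64 KiB strips over four bdevs that do not exist yet
    $RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
    info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    # nothing is discovered yet, so the array must not report "online"
    [[ $(jq -r .state <<< "$info") == configuring ]]
    (( $(jq .num_base_bdevs_discovered <<< "$info") == 0 ))
    (( $(jq .num_base_bdevs_operational <<< "$info") == 4 ))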
13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.335 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.335 "name": "Existed_Raid", 00:17:33.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.335 "strip_size_kb": 64, 00:17:33.335 "state": "configuring", 00:17:33.335 "raid_level": "raid0", 00:17:33.335 "superblock": false, 00:17:33.335 "num_base_bdevs": 4, 00:17:33.335 "num_base_bdevs_discovered": 0, 00:17:33.335 "num_base_bdevs_operational": 4, 00:17:33.335 "base_bdevs_list": [ 00:17:33.335 { 00:17:33.335 "name": "BaseBdev1", 00:17:33.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.335 "is_configured": false, 00:17:33.335 "data_offset": 0, 00:17:33.335 "data_size": 0 00:17:33.335 }, 00:17:33.335 { 00:17:33.335 "name": "BaseBdev2", 00:17:33.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.335 "is_configured": false, 00:17:33.335 "data_offset": 0, 00:17:33.335 "data_size": 0 00:17:33.335 }, 00:17:33.335 { 00:17:33.335 "name": "BaseBdev3", 00:17:33.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.335 "is_configured": false, 00:17:33.335 "data_offset": 0, 00:17:33.335 "data_size": 0 00:17:33.335 }, 00:17:33.335 { 00:17:33.335 "name": "BaseBdev4", 00:17:33.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.335 "is_configured": false, 00:17:33.335 "data_offset": 0, 00:17:33.335 "data_size": 0 00:17:33.335 } 00:17:33.335 ] 00:17:33.335 }' 00:17:33.335 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.335 13:43:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.952 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:34.211 [2024-07-12 13:43:22.564083] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:34.211 [2024-07-12 13:43:22.564121] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cf8370 name Existed_Raid, state configuring 00:17:34.211 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:34.471 [2024-07-12 13:43:22.808742] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:34.471 [2024-07-12 13:43:22.808779] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:34.471 [2024-07-12 13:43:22.808790] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:34.471 [2024-07-12 13:43:22.808801] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:34.471 [2024-07-12 13:43:22.808810] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:34.471 [2024-07-12 13:43:22.808821] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:34.471 [2024-07-12 13:43:22.808830] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:34.471 [2024-07-12 13:43:22.808841] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:34.471 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:34.729 [2024-07-12 13:43:23.075255] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:34.729 BaseBdev1 00:17:34.729 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:34.729 13:43:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:34.729 13:43:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:34.729 13:43:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:34.729 13:43:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:34.729 13:43:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:34.729 13:43:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:34.989 13:43:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:34.989 [ 00:17:34.989 { 00:17:34.989 "name": "BaseBdev1", 00:17:34.989 "aliases": [ 00:17:34.989 "4957fce9-682b-462d-acb7-ff3a5fe30b1c" 00:17:34.989 ], 00:17:34.989 "product_name": "Malloc disk", 00:17:34.989 "block_size": 512, 00:17:34.989 "num_blocks": 65536, 00:17:34.989 "uuid": "4957fce9-682b-462d-acb7-ff3a5fe30b1c", 00:17:34.989 "assigned_rate_limits": { 00:17:34.989 "rw_ios_per_sec": 0, 00:17:34.989 "rw_mbytes_per_sec": 0, 00:17:34.989 "r_mbytes_per_sec": 0, 00:17:34.989 "w_mbytes_per_sec": 0 00:17:34.989 }, 00:17:34.989 "claimed": true, 00:17:34.989 "claim_type": "exclusive_write", 00:17:34.989 "zoned": false, 00:17:34.989 "supported_io_types": { 00:17:34.989 "read": true, 00:17:34.989 "write": true, 00:17:34.989 "unmap": true, 00:17:34.989 "flush": true, 00:17:34.989 "reset": true, 00:17:34.989 "nvme_admin": false, 00:17:34.989 "nvme_io": false, 00:17:34.989 "nvme_io_md": false, 00:17:34.989 "write_zeroes": true, 00:17:34.989 "zcopy": true, 00:17:34.989 "get_zone_info": false, 00:17:34.989 "zone_management": false, 00:17:34.989 "zone_append": false, 00:17:34.989 "compare": false, 00:17:34.989 "compare_and_write": false, 00:17:34.989 "abort": true, 00:17:34.989 "seek_hole": false, 00:17:34.989 "seek_data": false, 00:17:34.989 "copy": true, 00:17:34.989 "nvme_iov_md": false 00:17:34.989 }, 00:17:34.989 "memory_domains": [ 00:17:34.989 { 00:17:34.989 "dma_device_id": "system", 00:17:34.989 "dma_device_type": 1 00:17:34.989 }, 00:17:34.989 { 00:17:34.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:34.989 "dma_device_type": 2 00:17:34.989 } 00:17:34.989 ], 00:17:34.989 "driver_specific": {} 00:17:34.989 } 00:17:34.989 ] 00:17:35.247 13:43:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:35.247 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:35.247 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:17:35.247 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:35.247 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:35.247 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:35.247 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:35.247 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:35.248 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:35.248 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:35.248 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:35.248 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.248 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:35.506 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.506 "name": "Existed_Raid", 00:17:35.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.506 "strip_size_kb": 64, 00:17:35.506 "state": "configuring", 00:17:35.506 "raid_level": "raid0", 00:17:35.506 "superblock": false, 00:17:35.506 "num_base_bdevs": 4, 00:17:35.506 "num_base_bdevs_discovered": 1, 00:17:35.506 "num_base_bdevs_operational": 4, 00:17:35.506 "base_bdevs_list": [ 00:17:35.506 { 00:17:35.506 "name": "BaseBdev1", 00:17:35.506 "uuid": "4957fce9-682b-462d-acb7-ff3a5fe30b1c", 00:17:35.506 "is_configured": true, 00:17:35.506 "data_offset": 0, 00:17:35.506 "data_size": 65536 00:17:35.506 }, 00:17:35.506 { 00:17:35.506 "name": "BaseBdev2", 00:17:35.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.506 "is_configured": false, 00:17:35.506 "data_offset": 0, 00:17:35.506 "data_size": 0 00:17:35.506 }, 00:17:35.506 { 00:17:35.506 "name": "BaseBdev3", 00:17:35.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.506 "is_configured": false, 00:17:35.506 "data_offset": 0, 00:17:35.506 "data_size": 0 00:17:35.506 }, 00:17:35.506 { 00:17:35.506 "name": "BaseBdev4", 00:17:35.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.506 "is_configured": false, 00:17:35.506 "data_offset": 0, 00:17:35.506 "data_size": 0 00:17:35.506 } 00:17:35.506 ] 00:17:35.506 }' 00:17:35.506 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.506 13:43:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:36.073 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:36.073 [2024-07-12 13:43:24.631403] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:36.073 [2024-07-12 13:43:24.631446] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cf7be0 name Existed_Raid, state configuring 00:17:36.073 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 
64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:36.332 [2024-07-12 13:43:24.872075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:36.332 [2024-07-12 13:43:24.873599] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:36.332 [2024-07-12 13:43:24.873636] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:36.332 [2024-07-12 13:43:24.873647] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:36.332 [2024-07-12 13:43:24.873658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:36.332 [2024-07-12 13:43:24.873668] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:36.332 [2024-07-12 13:43:24.873679] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.332 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.590 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.590 "name": "Existed_Raid", 00:17:36.590 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.590 "strip_size_kb": 64, 00:17:36.590 "state": "configuring", 00:17:36.590 "raid_level": "raid0", 00:17:36.590 "superblock": false, 00:17:36.590 "num_base_bdevs": 4, 00:17:36.590 "num_base_bdevs_discovered": 1, 00:17:36.590 "num_base_bdevs_operational": 4, 00:17:36.590 "base_bdevs_list": [ 00:17:36.590 { 00:17:36.590 "name": "BaseBdev1", 00:17:36.590 "uuid": "4957fce9-682b-462d-acb7-ff3a5fe30b1c", 00:17:36.590 "is_configured": true, 00:17:36.590 "data_offset": 0, 00:17:36.590 "data_size": 65536 00:17:36.590 }, 00:17:36.590 { 00:17:36.590 "name": "BaseBdev2", 00:17:36.590 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.590 "is_configured": 
false, 00:17:36.590 "data_offset": 0, 00:17:36.590 "data_size": 0 00:17:36.590 }, 00:17:36.591 { 00:17:36.591 "name": "BaseBdev3", 00:17:36.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.591 "is_configured": false, 00:17:36.591 "data_offset": 0, 00:17:36.591 "data_size": 0 00:17:36.591 }, 00:17:36.591 { 00:17:36.591 "name": "BaseBdev4", 00:17:36.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.591 "is_configured": false, 00:17:36.591 "data_offset": 0, 00:17:36.591 "data_size": 0 00:17:36.591 } 00:17:36.591 ] 00:17:36.591 }' 00:17:36.591 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.591 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:37.158 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:37.418 [2024-07-12 13:43:25.962454] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:37.418 BaseBdev2 00:17:37.418 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:37.418 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:37.418 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:37.418 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:37.418 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:37.418 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:37.418 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:37.677 13:43:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:37.936 [ 00:17:37.936 { 00:17:37.936 "name": "BaseBdev2", 00:17:37.936 "aliases": [ 00:17:37.936 "a336eb16-2153-4aa1-840a-b723ce13379e" 00:17:37.936 ], 00:17:37.936 "product_name": "Malloc disk", 00:17:37.936 "block_size": 512, 00:17:37.936 "num_blocks": 65536, 00:17:37.936 "uuid": "a336eb16-2153-4aa1-840a-b723ce13379e", 00:17:37.936 "assigned_rate_limits": { 00:17:37.936 "rw_ios_per_sec": 0, 00:17:37.936 "rw_mbytes_per_sec": 0, 00:17:37.936 "r_mbytes_per_sec": 0, 00:17:37.936 "w_mbytes_per_sec": 0 00:17:37.936 }, 00:17:37.936 "claimed": true, 00:17:37.936 "claim_type": "exclusive_write", 00:17:37.936 "zoned": false, 00:17:37.936 "supported_io_types": { 00:17:37.936 "read": true, 00:17:37.936 "write": true, 00:17:37.936 "unmap": true, 00:17:37.936 "flush": true, 00:17:37.936 "reset": true, 00:17:37.936 "nvme_admin": false, 00:17:37.936 "nvme_io": false, 00:17:37.936 "nvme_io_md": false, 00:17:37.936 "write_zeroes": true, 00:17:37.936 "zcopy": true, 00:17:37.936 "get_zone_info": false, 00:17:37.936 "zone_management": false, 00:17:37.936 "zone_append": false, 00:17:37.936 "compare": false, 00:17:37.936 "compare_and_write": false, 00:17:37.936 "abort": true, 00:17:37.936 "seek_hole": false, 00:17:37.936 "seek_data": false, 00:17:37.936 "copy": true, 00:17:37.936 "nvme_iov_md": false 00:17:37.936 }, 00:17:37.936 
"memory_domains": [ 00:17:37.936 { 00:17:37.936 "dma_device_id": "system", 00:17:37.936 "dma_device_type": 1 00:17:37.936 }, 00:17:37.936 { 00:17:37.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.936 "dma_device_type": 2 00:17:37.936 } 00:17:37.936 ], 00:17:37.936 "driver_specific": {} 00:17:37.936 } 00:17:37.936 ] 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.936 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.195 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.195 "name": "Existed_Raid", 00:17:38.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.195 "strip_size_kb": 64, 00:17:38.195 "state": "configuring", 00:17:38.195 "raid_level": "raid0", 00:17:38.195 "superblock": false, 00:17:38.195 "num_base_bdevs": 4, 00:17:38.195 "num_base_bdevs_discovered": 2, 00:17:38.195 "num_base_bdevs_operational": 4, 00:17:38.195 "base_bdevs_list": [ 00:17:38.195 { 00:17:38.195 "name": "BaseBdev1", 00:17:38.195 "uuid": "4957fce9-682b-462d-acb7-ff3a5fe30b1c", 00:17:38.195 "is_configured": true, 00:17:38.195 "data_offset": 0, 00:17:38.195 "data_size": 65536 00:17:38.195 }, 00:17:38.195 { 00:17:38.195 "name": "BaseBdev2", 00:17:38.195 "uuid": "a336eb16-2153-4aa1-840a-b723ce13379e", 00:17:38.195 "is_configured": true, 00:17:38.195 "data_offset": 0, 00:17:38.195 "data_size": 65536 00:17:38.195 }, 00:17:38.195 { 00:17:38.195 "name": "BaseBdev3", 00:17:38.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.195 "is_configured": false, 00:17:38.195 "data_offset": 0, 00:17:38.195 "data_size": 0 00:17:38.195 }, 00:17:38.195 { 00:17:38.195 "name": "BaseBdev4", 00:17:38.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.195 "is_configured": false, 00:17:38.195 "data_offset": 0, 00:17:38.195 "data_size": 0 00:17:38.195 } 00:17:38.195 ] 00:17:38.195 
}' 00:17:38.195 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.195 13:43:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.764 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:39.023 [2024-07-12 13:43:27.485992] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:39.023 BaseBdev3 00:17:39.023 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:39.023 13:43:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:39.023 13:43:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:39.023 13:43:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:39.023 13:43:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:39.023 13:43:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:39.023 13:43:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:39.282 13:43:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:39.542 [ 00:17:39.542 { 00:17:39.542 "name": "BaseBdev3", 00:17:39.542 "aliases": [ 00:17:39.542 "e6888bf7-3de2-4118-941d-7cae8e5b33b1" 00:17:39.542 ], 00:17:39.542 "product_name": "Malloc disk", 00:17:39.542 "block_size": 512, 00:17:39.542 "num_blocks": 65536, 00:17:39.542 "uuid": "e6888bf7-3de2-4118-941d-7cae8e5b33b1", 00:17:39.542 "assigned_rate_limits": { 00:17:39.542 "rw_ios_per_sec": 0, 00:17:39.542 "rw_mbytes_per_sec": 0, 00:17:39.542 "r_mbytes_per_sec": 0, 00:17:39.542 "w_mbytes_per_sec": 0 00:17:39.542 }, 00:17:39.542 "claimed": true, 00:17:39.542 "claim_type": "exclusive_write", 00:17:39.542 "zoned": false, 00:17:39.542 "supported_io_types": { 00:17:39.542 "read": true, 00:17:39.542 "write": true, 00:17:39.542 "unmap": true, 00:17:39.542 "flush": true, 00:17:39.542 "reset": true, 00:17:39.542 "nvme_admin": false, 00:17:39.542 "nvme_io": false, 00:17:39.542 "nvme_io_md": false, 00:17:39.542 "write_zeroes": true, 00:17:39.542 "zcopy": true, 00:17:39.542 "get_zone_info": false, 00:17:39.542 "zone_management": false, 00:17:39.542 "zone_append": false, 00:17:39.542 "compare": false, 00:17:39.542 "compare_and_write": false, 00:17:39.542 "abort": true, 00:17:39.542 "seek_hole": false, 00:17:39.542 "seek_data": false, 00:17:39.542 "copy": true, 00:17:39.542 "nvme_iov_md": false 00:17:39.542 }, 00:17:39.542 "memory_domains": [ 00:17:39.542 { 00:17:39.542 "dma_device_id": "system", 00:17:39.542 "dma_device_type": 1 00:17:39.542 }, 00:17:39.542 { 00:17:39.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.542 "dma_device_type": 2 00:17:39.542 } 00:17:39.542 ], 00:17:39.542 "driver_specific": {} 00:17:39.542 } 00:17:39.542 ] 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:39.542 13:43:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.542 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:39.802 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:39.802 "name": "Existed_Raid", 00:17:39.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.802 "strip_size_kb": 64, 00:17:39.802 "state": "configuring", 00:17:39.802 "raid_level": "raid0", 00:17:39.802 "superblock": false, 00:17:39.802 "num_base_bdevs": 4, 00:17:39.802 "num_base_bdevs_discovered": 3, 00:17:39.802 "num_base_bdevs_operational": 4, 00:17:39.802 "base_bdevs_list": [ 00:17:39.802 { 00:17:39.802 "name": "BaseBdev1", 00:17:39.802 "uuid": "4957fce9-682b-462d-acb7-ff3a5fe30b1c", 00:17:39.802 "is_configured": true, 00:17:39.802 "data_offset": 0, 00:17:39.802 "data_size": 65536 00:17:39.802 }, 00:17:39.802 { 00:17:39.802 "name": "BaseBdev2", 00:17:39.802 "uuid": "a336eb16-2153-4aa1-840a-b723ce13379e", 00:17:39.802 "is_configured": true, 00:17:39.802 "data_offset": 0, 00:17:39.802 "data_size": 65536 00:17:39.802 }, 00:17:39.802 { 00:17:39.802 "name": "BaseBdev3", 00:17:39.802 "uuid": "e6888bf7-3de2-4118-941d-7cae8e5b33b1", 00:17:39.802 "is_configured": true, 00:17:39.802 "data_offset": 0, 00:17:39.802 "data_size": 65536 00:17:39.802 }, 00:17:39.802 { 00:17:39.802 "name": "BaseBdev4", 00:17:39.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.802 "is_configured": false, 00:17:39.802 "data_offset": 0, 00:17:39.802 "data_size": 0 00:17:39.802 } 00:17:39.802 ] 00:17:39.802 }' 00:17:39.802 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:39.802 13:43:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:40.369 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:40.628 [2024-07-12 13:43:29.097836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is 
claimed 00:17:40.628 [2024-07-12 13:43:29.097875] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cf8c40 00:17:40.628 [2024-07-12 13:43:29.097884] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:40.628 [2024-07-12 13:43:29.098112] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf98c0 00:17:40.628 [2024-07-12 13:43:29.098235] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cf8c40 00:17:40.628 [2024-07-12 13:43:29.098245] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cf8c40 00:17:40.628 [2024-07-12 13:43:29.098415] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:40.628 BaseBdev4 00:17:40.628 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:40.628 13:43:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:40.628 13:43:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:40.628 13:43:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:40.628 13:43:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:40.628 13:43:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:40.628 13:43:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:40.887 13:43:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:41.146 [ 00:17:41.146 { 00:17:41.146 "name": "BaseBdev4", 00:17:41.146 "aliases": [ 00:17:41.146 "b92287d0-0e60-43fd-a97d-e989a19cea56" 00:17:41.146 ], 00:17:41.146 "product_name": "Malloc disk", 00:17:41.146 "block_size": 512, 00:17:41.146 "num_blocks": 65536, 00:17:41.146 "uuid": "b92287d0-0e60-43fd-a97d-e989a19cea56", 00:17:41.146 "assigned_rate_limits": { 00:17:41.146 "rw_ios_per_sec": 0, 00:17:41.146 "rw_mbytes_per_sec": 0, 00:17:41.146 "r_mbytes_per_sec": 0, 00:17:41.146 "w_mbytes_per_sec": 0 00:17:41.146 }, 00:17:41.146 "claimed": true, 00:17:41.146 "claim_type": "exclusive_write", 00:17:41.146 "zoned": false, 00:17:41.146 "supported_io_types": { 00:17:41.146 "read": true, 00:17:41.146 "write": true, 00:17:41.146 "unmap": true, 00:17:41.146 "flush": true, 00:17:41.146 "reset": true, 00:17:41.146 "nvme_admin": false, 00:17:41.146 "nvme_io": false, 00:17:41.146 "nvme_io_md": false, 00:17:41.146 "write_zeroes": true, 00:17:41.146 "zcopy": true, 00:17:41.146 "get_zone_info": false, 00:17:41.146 "zone_management": false, 00:17:41.146 "zone_append": false, 00:17:41.146 "compare": false, 00:17:41.146 "compare_and_write": false, 00:17:41.146 "abort": true, 00:17:41.146 "seek_hole": false, 00:17:41.146 "seek_data": false, 00:17:41.146 "copy": true, 00:17:41.146 "nvme_iov_md": false 00:17:41.146 }, 00:17:41.146 "memory_domains": [ 00:17:41.146 { 00:17:41.146 "dma_device_id": "system", 00:17:41.146 "dma_device_type": 1 00:17:41.146 }, 00:17:41.146 { 00:17:41.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.146 "dma_device_type": 2 00:17:41.146 } 00:17:41.146 ], 00:17:41.146 "driver_specific": {} 00:17:41.146 } 00:17:41.146 ] 
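The loop above fills in the missing members one bdev_malloc_create at a time; after each one the state is re-checked and num_base_bdevs_discovered grows by one while the array stays "configuring". Only when BaseBdev4 is claimed does the raid register its I/O device and go "online" with blockcnt 262144 (4 x 65536 blocks of raid0 capacity). A condensed sketch of that loop (RPC shorthand and the compact jq output line are illustrative, the individual commands are the ones in the trace):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for i in 1 2 3 4; do
        # each member is a 32 MB / 512-byte-block malloc bdev; the configuring raid claims it immediately
        $RPC bdev_malloc_create 32 512 -b BaseBdev$i
        $RPC bdev_wait_for_examine
        # expect "configuring i" for i < 4 and "online 4" once the last member arrives
        $RPC bdev_raid_get_bdevs all \
            | jq -r '.[] | select(.name == "Existed_Raid") | "\(.state) \(.num_base_bdevs_discovered)"'
    done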
00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.146 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:41.405 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.405 "name": "Existed_Raid", 00:17:41.405 "uuid": "ee9d2df2-d164-470b-a19c-68e4012ec5f7", 00:17:41.405 "strip_size_kb": 64, 00:17:41.405 "state": "online", 00:17:41.405 "raid_level": "raid0", 00:17:41.405 "superblock": false, 00:17:41.405 "num_base_bdevs": 4, 00:17:41.405 "num_base_bdevs_discovered": 4, 00:17:41.405 "num_base_bdevs_operational": 4, 00:17:41.405 "base_bdevs_list": [ 00:17:41.405 { 00:17:41.405 "name": "BaseBdev1", 00:17:41.405 "uuid": "4957fce9-682b-462d-acb7-ff3a5fe30b1c", 00:17:41.405 "is_configured": true, 00:17:41.405 "data_offset": 0, 00:17:41.405 "data_size": 65536 00:17:41.405 }, 00:17:41.405 { 00:17:41.405 "name": "BaseBdev2", 00:17:41.405 "uuid": "a336eb16-2153-4aa1-840a-b723ce13379e", 00:17:41.405 "is_configured": true, 00:17:41.405 "data_offset": 0, 00:17:41.405 "data_size": 65536 00:17:41.405 }, 00:17:41.405 { 00:17:41.405 "name": "BaseBdev3", 00:17:41.405 "uuid": "e6888bf7-3de2-4118-941d-7cae8e5b33b1", 00:17:41.405 "is_configured": true, 00:17:41.405 "data_offset": 0, 00:17:41.405 "data_size": 65536 00:17:41.405 }, 00:17:41.405 { 00:17:41.405 "name": "BaseBdev4", 00:17:41.405 "uuid": "b92287d0-0e60-43fd-a97d-e989a19cea56", 00:17:41.405 "is_configured": true, 00:17:41.405 "data_offset": 0, 00:17:41.405 "data_size": 65536 00:17:41.405 } 00:17:41.405 ] 00:17:41.405 }' 00:17:41.405 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.405 13:43:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.974 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 
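What follows is verify_raid_bdev_properties: the Raid Volume is dumped with bdev_get_bdevs and its layout fields are compared against every configured base bdev, so that block_size, md_size, md_interleave and dif_type agree across the stack. A condensed sketch of that comparison (the jq filters are the ones in the trace below; the loop body is an abbreviation of the per-field checks):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    raid_info=$($RPC bdev_get_bdevs -b Existed_Raid | jq '.[]')
    # only members that are actually configured take part in the comparison
    names=$(jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' <<< "$raid_info")
    for name in $names; do
        base_info=$($RPC bdev_get_bdevs -b "$name" | jq '.[]')
        # every layout-relevant field of the raid must match its base bdevs
        [[ $(jq .block_size    <<< "$raid_info") == $(jq .block_size    <<< "$base_info") ]]
        [[ $(jq .md_size       <<< "$raid_info") == $(jq .md_size       <<< "$base_info") ]]
        [[ $(jq .md_interleave <<< "$raid_info") == $(jq .md_interleave <<< "$base_info") ]]
        [[ $(jq .dif_type      <<< "$raid_info") == $(jq .dif_type      <<< "$base_info") ]]
    done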
00:17:41.974 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:41.974 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:41.974 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:41.974 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:41.974 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:41.974 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:41.974 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:42.234 [2024-07-12 13:43:30.626240] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:42.234 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:42.234 "name": "Existed_Raid", 00:17:42.234 "aliases": [ 00:17:42.234 "ee9d2df2-d164-470b-a19c-68e4012ec5f7" 00:17:42.234 ], 00:17:42.234 "product_name": "Raid Volume", 00:17:42.234 "block_size": 512, 00:17:42.234 "num_blocks": 262144, 00:17:42.234 "uuid": "ee9d2df2-d164-470b-a19c-68e4012ec5f7", 00:17:42.234 "assigned_rate_limits": { 00:17:42.234 "rw_ios_per_sec": 0, 00:17:42.234 "rw_mbytes_per_sec": 0, 00:17:42.234 "r_mbytes_per_sec": 0, 00:17:42.234 "w_mbytes_per_sec": 0 00:17:42.234 }, 00:17:42.234 "claimed": false, 00:17:42.234 "zoned": false, 00:17:42.234 "supported_io_types": { 00:17:42.234 "read": true, 00:17:42.234 "write": true, 00:17:42.234 "unmap": true, 00:17:42.234 "flush": true, 00:17:42.234 "reset": true, 00:17:42.234 "nvme_admin": false, 00:17:42.234 "nvme_io": false, 00:17:42.234 "nvme_io_md": false, 00:17:42.234 "write_zeroes": true, 00:17:42.234 "zcopy": false, 00:17:42.234 "get_zone_info": false, 00:17:42.234 "zone_management": false, 00:17:42.234 "zone_append": false, 00:17:42.234 "compare": false, 00:17:42.234 "compare_and_write": false, 00:17:42.234 "abort": false, 00:17:42.234 "seek_hole": false, 00:17:42.234 "seek_data": false, 00:17:42.234 "copy": false, 00:17:42.234 "nvme_iov_md": false 00:17:42.234 }, 00:17:42.234 "memory_domains": [ 00:17:42.234 { 00:17:42.234 "dma_device_id": "system", 00:17:42.234 "dma_device_type": 1 00:17:42.234 }, 00:17:42.234 { 00:17:42.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.234 "dma_device_type": 2 00:17:42.234 }, 00:17:42.234 { 00:17:42.234 "dma_device_id": "system", 00:17:42.234 "dma_device_type": 1 00:17:42.234 }, 00:17:42.234 { 00:17:42.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.234 "dma_device_type": 2 00:17:42.234 }, 00:17:42.234 { 00:17:42.234 "dma_device_id": "system", 00:17:42.234 "dma_device_type": 1 00:17:42.234 }, 00:17:42.234 { 00:17:42.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.234 "dma_device_type": 2 00:17:42.234 }, 00:17:42.234 { 00:17:42.234 "dma_device_id": "system", 00:17:42.234 "dma_device_type": 1 00:17:42.234 }, 00:17:42.234 { 00:17:42.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.234 "dma_device_type": 2 00:17:42.234 } 00:17:42.234 ], 00:17:42.234 "driver_specific": { 00:17:42.234 "raid": { 00:17:42.234 "uuid": "ee9d2df2-d164-470b-a19c-68e4012ec5f7", 00:17:42.234 "strip_size_kb": 64, 00:17:42.234 "state": "online", 00:17:42.234 "raid_level": "raid0", 00:17:42.234 "superblock": false, 00:17:42.234 
"num_base_bdevs": 4, 00:17:42.234 "num_base_bdevs_discovered": 4, 00:17:42.234 "num_base_bdevs_operational": 4, 00:17:42.234 "base_bdevs_list": [ 00:17:42.234 { 00:17:42.234 "name": "BaseBdev1", 00:17:42.234 "uuid": "4957fce9-682b-462d-acb7-ff3a5fe30b1c", 00:17:42.234 "is_configured": true, 00:17:42.234 "data_offset": 0, 00:17:42.234 "data_size": 65536 00:17:42.234 }, 00:17:42.234 { 00:17:42.234 "name": "BaseBdev2", 00:17:42.234 "uuid": "a336eb16-2153-4aa1-840a-b723ce13379e", 00:17:42.234 "is_configured": true, 00:17:42.234 "data_offset": 0, 00:17:42.234 "data_size": 65536 00:17:42.234 }, 00:17:42.234 { 00:17:42.234 "name": "BaseBdev3", 00:17:42.234 "uuid": "e6888bf7-3de2-4118-941d-7cae8e5b33b1", 00:17:42.234 "is_configured": true, 00:17:42.234 "data_offset": 0, 00:17:42.234 "data_size": 65536 00:17:42.234 }, 00:17:42.234 { 00:17:42.234 "name": "BaseBdev4", 00:17:42.234 "uuid": "b92287d0-0e60-43fd-a97d-e989a19cea56", 00:17:42.234 "is_configured": true, 00:17:42.234 "data_offset": 0, 00:17:42.234 "data_size": 65536 00:17:42.234 } 00:17:42.234 ] 00:17:42.234 } 00:17:42.234 } 00:17:42.234 }' 00:17:42.234 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:42.234 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:42.234 BaseBdev2 00:17:42.234 BaseBdev3 00:17:42.234 BaseBdev4' 00:17:42.234 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:42.234 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:42.234 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:42.494 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:42.494 "name": "BaseBdev1", 00:17:42.494 "aliases": [ 00:17:42.494 "4957fce9-682b-462d-acb7-ff3a5fe30b1c" 00:17:42.494 ], 00:17:42.494 "product_name": "Malloc disk", 00:17:42.494 "block_size": 512, 00:17:42.494 "num_blocks": 65536, 00:17:42.494 "uuid": "4957fce9-682b-462d-acb7-ff3a5fe30b1c", 00:17:42.494 "assigned_rate_limits": { 00:17:42.494 "rw_ios_per_sec": 0, 00:17:42.494 "rw_mbytes_per_sec": 0, 00:17:42.494 "r_mbytes_per_sec": 0, 00:17:42.494 "w_mbytes_per_sec": 0 00:17:42.494 }, 00:17:42.494 "claimed": true, 00:17:42.494 "claim_type": "exclusive_write", 00:17:42.494 "zoned": false, 00:17:42.494 "supported_io_types": { 00:17:42.494 "read": true, 00:17:42.494 "write": true, 00:17:42.494 "unmap": true, 00:17:42.494 "flush": true, 00:17:42.494 "reset": true, 00:17:42.494 "nvme_admin": false, 00:17:42.494 "nvme_io": false, 00:17:42.494 "nvme_io_md": false, 00:17:42.494 "write_zeroes": true, 00:17:42.494 "zcopy": true, 00:17:42.494 "get_zone_info": false, 00:17:42.494 "zone_management": false, 00:17:42.494 "zone_append": false, 00:17:42.494 "compare": false, 00:17:42.494 "compare_and_write": false, 00:17:42.494 "abort": true, 00:17:42.494 "seek_hole": false, 00:17:42.494 "seek_data": false, 00:17:42.494 "copy": true, 00:17:42.494 "nvme_iov_md": false 00:17:42.494 }, 00:17:42.494 "memory_domains": [ 00:17:42.494 { 00:17:42.494 "dma_device_id": "system", 00:17:42.494 "dma_device_type": 1 00:17:42.494 }, 00:17:42.494 { 00:17:42.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.494 "dma_device_type": 2 00:17:42.494 } 00:17:42.494 ], 
00:17:42.494 "driver_specific": {} 00:17:42.494 }' 00:17:42.494 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.494 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.494 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:42.494 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.494 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.753 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:42.753 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.753 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.753 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:42.753 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.753 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.753 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:42.753 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:42.753 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:42.753 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:43.012 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:43.012 "name": "BaseBdev2", 00:17:43.012 "aliases": [ 00:17:43.012 "a336eb16-2153-4aa1-840a-b723ce13379e" 00:17:43.012 ], 00:17:43.012 "product_name": "Malloc disk", 00:17:43.012 "block_size": 512, 00:17:43.012 "num_blocks": 65536, 00:17:43.012 "uuid": "a336eb16-2153-4aa1-840a-b723ce13379e", 00:17:43.012 "assigned_rate_limits": { 00:17:43.012 "rw_ios_per_sec": 0, 00:17:43.012 "rw_mbytes_per_sec": 0, 00:17:43.012 "r_mbytes_per_sec": 0, 00:17:43.012 "w_mbytes_per_sec": 0 00:17:43.012 }, 00:17:43.012 "claimed": true, 00:17:43.012 "claim_type": "exclusive_write", 00:17:43.012 "zoned": false, 00:17:43.012 "supported_io_types": { 00:17:43.012 "read": true, 00:17:43.012 "write": true, 00:17:43.012 "unmap": true, 00:17:43.012 "flush": true, 00:17:43.012 "reset": true, 00:17:43.012 "nvme_admin": false, 00:17:43.012 "nvme_io": false, 00:17:43.012 "nvme_io_md": false, 00:17:43.012 "write_zeroes": true, 00:17:43.012 "zcopy": true, 00:17:43.012 "get_zone_info": false, 00:17:43.012 "zone_management": false, 00:17:43.012 "zone_append": false, 00:17:43.012 "compare": false, 00:17:43.012 "compare_and_write": false, 00:17:43.012 "abort": true, 00:17:43.012 "seek_hole": false, 00:17:43.012 "seek_data": false, 00:17:43.012 "copy": true, 00:17:43.012 "nvme_iov_md": false 00:17:43.012 }, 00:17:43.012 "memory_domains": [ 00:17:43.012 { 00:17:43.012 "dma_device_id": "system", 00:17:43.012 "dma_device_type": 1 00:17:43.012 }, 00:17:43.012 { 00:17:43.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.012 "dma_device_type": 2 00:17:43.012 } 00:17:43.012 ], 00:17:43.012 "driver_specific": {} 00:17:43.012 }' 00:17:43.012 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.012 13:43:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.271 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:43.271 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.271 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.271 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:43.271 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.271 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.271 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:43.271 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.271 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.531 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:43.531 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:43.531 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:43.531 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:43.790 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:43.790 "name": "BaseBdev3", 00:17:43.790 "aliases": [ 00:17:43.790 "e6888bf7-3de2-4118-941d-7cae8e5b33b1" 00:17:43.790 ], 00:17:43.790 "product_name": "Malloc disk", 00:17:43.790 "block_size": 512, 00:17:43.790 "num_blocks": 65536, 00:17:43.790 "uuid": "e6888bf7-3de2-4118-941d-7cae8e5b33b1", 00:17:43.790 "assigned_rate_limits": { 00:17:43.790 "rw_ios_per_sec": 0, 00:17:43.790 "rw_mbytes_per_sec": 0, 00:17:43.790 "r_mbytes_per_sec": 0, 00:17:43.790 "w_mbytes_per_sec": 0 00:17:43.790 }, 00:17:43.790 "claimed": true, 00:17:43.790 "claim_type": "exclusive_write", 00:17:43.790 "zoned": false, 00:17:43.790 "supported_io_types": { 00:17:43.790 "read": true, 00:17:43.790 "write": true, 00:17:43.790 "unmap": true, 00:17:43.790 "flush": true, 00:17:43.790 "reset": true, 00:17:43.790 "nvme_admin": false, 00:17:43.790 "nvme_io": false, 00:17:43.790 "nvme_io_md": false, 00:17:43.790 "write_zeroes": true, 00:17:43.790 "zcopy": true, 00:17:43.790 "get_zone_info": false, 00:17:43.790 "zone_management": false, 00:17:43.790 "zone_append": false, 00:17:43.790 "compare": false, 00:17:43.790 "compare_and_write": false, 00:17:43.790 "abort": true, 00:17:43.790 "seek_hole": false, 00:17:43.790 "seek_data": false, 00:17:43.790 "copy": true, 00:17:43.790 "nvme_iov_md": false 00:17:43.790 }, 00:17:43.790 "memory_domains": [ 00:17:43.790 { 00:17:43.791 "dma_device_id": "system", 00:17:43.791 "dma_device_type": 1 00:17:43.791 }, 00:17:43.791 { 00:17:43.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.791 "dma_device_type": 2 00:17:43.791 } 00:17:43.791 ], 00:17:43.791 "driver_specific": {} 00:17:43.791 }' 00:17:43.791 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.791 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.791 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 
]] 00:17:43.791 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.791 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.791 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:43.791 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.791 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.050 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:44.050 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.050 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.050 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:44.050 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:44.050 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:44.050 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:44.309 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:44.309 "name": "BaseBdev4", 00:17:44.309 "aliases": [ 00:17:44.309 "b92287d0-0e60-43fd-a97d-e989a19cea56" 00:17:44.309 ], 00:17:44.309 "product_name": "Malloc disk", 00:17:44.309 "block_size": 512, 00:17:44.309 "num_blocks": 65536, 00:17:44.309 "uuid": "b92287d0-0e60-43fd-a97d-e989a19cea56", 00:17:44.309 "assigned_rate_limits": { 00:17:44.309 "rw_ios_per_sec": 0, 00:17:44.309 "rw_mbytes_per_sec": 0, 00:17:44.309 "r_mbytes_per_sec": 0, 00:17:44.309 "w_mbytes_per_sec": 0 00:17:44.309 }, 00:17:44.309 "claimed": true, 00:17:44.309 "claim_type": "exclusive_write", 00:17:44.309 "zoned": false, 00:17:44.309 "supported_io_types": { 00:17:44.309 "read": true, 00:17:44.309 "write": true, 00:17:44.309 "unmap": true, 00:17:44.309 "flush": true, 00:17:44.309 "reset": true, 00:17:44.309 "nvme_admin": false, 00:17:44.309 "nvme_io": false, 00:17:44.309 "nvme_io_md": false, 00:17:44.309 "write_zeroes": true, 00:17:44.309 "zcopy": true, 00:17:44.309 "get_zone_info": false, 00:17:44.309 "zone_management": false, 00:17:44.309 "zone_append": false, 00:17:44.309 "compare": false, 00:17:44.309 "compare_and_write": false, 00:17:44.309 "abort": true, 00:17:44.309 "seek_hole": false, 00:17:44.309 "seek_data": false, 00:17:44.309 "copy": true, 00:17:44.309 "nvme_iov_md": false 00:17:44.309 }, 00:17:44.309 "memory_domains": [ 00:17:44.309 { 00:17:44.309 "dma_device_id": "system", 00:17:44.309 "dma_device_type": 1 00:17:44.309 }, 00:17:44.309 { 00:17:44.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.309 "dma_device_type": 2 00:17:44.309 } 00:17:44.309 ], 00:17:44.309 "driver_specific": {} 00:17:44.309 }' 00:17:44.309 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:44.309 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:44.309 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:44.309 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:44.309 13:43:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:44.584 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:44.584 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.584 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.584 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:44.584 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.584 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.584 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:44.584 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:44.842 [2024-07-12 13:43:33.309094] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:44.842 [2024-07-12 13:43:33.309125] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:44.842 [2024-07-12 13:43:33.309176] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.842 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:45.101 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.101 "name": "Existed_Raid", 00:17:45.101 "uuid": "ee9d2df2-d164-470b-a19c-68e4012ec5f7", 00:17:45.101 
"strip_size_kb": 64, 00:17:45.101 "state": "offline", 00:17:45.101 "raid_level": "raid0", 00:17:45.101 "superblock": false, 00:17:45.101 "num_base_bdevs": 4, 00:17:45.101 "num_base_bdevs_discovered": 3, 00:17:45.101 "num_base_bdevs_operational": 3, 00:17:45.101 "base_bdevs_list": [ 00:17:45.101 { 00:17:45.101 "name": null, 00:17:45.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.101 "is_configured": false, 00:17:45.101 "data_offset": 0, 00:17:45.101 "data_size": 65536 00:17:45.101 }, 00:17:45.101 { 00:17:45.101 "name": "BaseBdev2", 00:17:45.101 "uuid": "a336eb16-2153-4aa1-840a-b723ce13379e", 00:17:45.101 "is_configured": true, 00:17:45.101 "data_offset": 0, 00:17:45.101 "data_size": 65536 00:17:45.101 }, 00:17:45.101 { 00:17:45.101 "name": "BaseBdev3", 00:17:45.101 "uuid": "e6888bf7-3de2-4118-941d-7cae8e5b33b1", 00:17:45.101 "is_configured": true, 00:17:45.101 "data_offset": 0, 00:17:45.101 "data_size": 65536 00:17:45.101 }, 00:17:45.101 { 00:17:45.101 "name": "BaseBdev4", 00:17:45.101 "uuid": "b92287d0-0e60-43fd-a97d-e989a19cea56", 00:17:45.101 "is_configured": true, 00:17:45.101 "data_offset": 0, 00:17:45.101 "data_size": 65536 00:17:45.101 } 00:17:45.101 ] 00:17:45.101 }' 00:17:45.101 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.101 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:45.669 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:45.669 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:45.669 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:45.669 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.928 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:45.928 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:45.928 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:46.496 [2024-07-12 13:43:34.919338] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:46.496 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:46.496 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:46.496 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.496 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:46.755 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:46.755 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:46.755 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:47.323 [2024-07-12 13:43:35.688025] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 
00:17:47.323 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:47.323 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:47.323 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.323 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:47.582 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:47.582 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:47.582 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:47.840 [2024-07-12 13:43:36.197846] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:47.840 [2024-07-12 13:43:36.197893] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cf8c40 name Existed_Raid, state offline 00:17:47.841 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:47.841 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:47.841 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.841 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:48.099 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:48.099 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:48.099 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:48.099 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:48.099 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:48.099 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:48.358 BaseBdev2 00:17:48.358 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:48.358 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:48.358 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:48.358 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:48.358 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:48.358 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:48.358 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:48.616 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:48.616 [ 00:17:48.616 { 00:17:48.616 "name": "BaseBdev2", 00:17:48.616 "aliases": [ 00:17:48.616 "eddcc6a7-98cc-4548-983d-ffc355164718" 00:17:48.616 ], 00:17:48.616 "product_name": "Malloc disk", 00:17:48.616 "block_size": 512, 00:17:48.616 "num_blocks": 65536, 00:17:48.616 "uuid": "eddcc6a7-98cc-4548-983d-ffc355164718", 00:17:48.616 "assigned_rate_limits": { 00:17:48.616 "rw_ios_per_sec": 0, 00:17:48.616 "rw_mbytes_per_sec": 0, 00:17:48.616 "r_mbytes_per_sec": 0, 00:17:48.616 "w_mbytes_per_sec": 0 00:17:48.616 }, 00:17:48.616 "claimed": false, 00:17:48.616 "zoned": false, 00:17:48.616 "supported_io_types": { 00:17:48.616 "read": true, 00:17:48.616 "write": true, 00:17:48.617 "unmap": true, 00:17:48.617 "flush": true, 00:17:48.617 "reset": true, 00:17:48.617 "nvme_admin": false, 00:17:48.617 "nvme_io": false, 00:17:48.617 "nvme_io_md": false, 00:17:48.617 "write_zeroes": true, 00:17:48.617 "zcopy": true, 00:17:48.617 "get_zone_info": false, 00:17:48.617 "zone_management": false, 00:17:48.617 "zone_append": false, 00:17:48.617 "compare": false, 00:17:48.617 "compare_and_write": false, 00:17:48.617 "abort": true, 00:17:48.617 "seek_hole": false, 00:17:48.617 "seek_data": false, 00:17:48.617 "copy": true, 00:17:48.617 "nvme_iov_md": false 00:17:48.617 }, 00:17:48.617 "memory_domains": [ 00:17:48.617 { 00:17:48.617 "dma_device_id": "system", 00:17:48.617 "dma_device_type": 1 00:17:48.617 }, 00:17:48.617 { 00:17:48.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.617 "dma_device_type": 2 00:17:48.617 } 00:17:48.617 ], 00:17:48.617 "driver_specific": {} 00:17:48.617 } 00:17:48.617 ] 00:17:48.617 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:48.617 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:48.617 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:48.617 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:48.875 BaseBdev3 00:17:48.875 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:48.875 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:48.875 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:48.875 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:48.875 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:48.875 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:48.875 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:49.134 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:49.393 [ 00:17:49.393 { 00:17:49.393 "name": "BaseBdev3", 00:17:49.393 "aliases": [ 00:17:49.393 "410b672f-7a0c-4ef2-8fe6-b44efcc5985c" 00:17:49.393 ], 00:17:49.393 "product_name": "Malloc disk", 00:17:49.393 "block_size": 
512, 00:17:49.393 "num_blocks": 65536, 00:17:49.393 "uuid": "410b672f-7a0c-4ef2-8fe6-b44efcc5985c", 00:17:49.393 "assigned_rate_limits": { 00:17:49.393 "rw_ios_per_sec": 0, 00:17:49.393 "rw_mbytes_per_sec": 0, 00:17:49.393 "r_mbytes_per_sec": 0, 00:17:49.393 "w_mbytes_per_sec": 0 00:17:49.393 }, 00:17:49.393 "claimed": false, 00:17:49.393 "zoned": false, 00:17:49.393 "supported_io_types": { 00:17:49.393 "read": true, 00:17:49.393 "write": true, 00:17:49.393 "unmap": true, 00:17:49.393 "flush": true, 00:17:49.393 "reset": true, 00:17:49.393 "nvme_admin": false, 00:17:49.393 "nvme_io": false, 00:17:49.393 "nvme_io_md": false, 00:17:49.393 "write_zeroes": true, 00:17:49.393 "zcopy": true, 00:17:49.393 "get_zone_info": false, 00:17:49.393 "zone_management": false, 00:17:49.393 "zone_append": false, 00:17:49.393 "compare": false, 00:17:49.393 "compare_and_write": false, 00:17:49.393 "abort": true, 00:17:49.393 "seek_hole": false, 00:17:49.393 "seek_data": false, 00:17:49.393 "copy": true, 00:17:49.393 "nvme_iov_md": false 00:17:49.393 }, 00:17:49.393 "memory_domains": [ 00:17:49.393 { 00:17:49.393 "dma_device_id": "system", 00:17:49.393 "dma_device_type": 1 00:17:49.393 }, 00:17:49.393 { 00:17:49.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.393 "dma_device_type": 2 00:17:49.393 } 00:17:49.393 ], 00:17:49.393 "driver_specific": {} 00:17:49.393 } 00:17:49.393 ] 00:17:49.393 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:49.393 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:49.393 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:49.393 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:49.652 BaseBdev4 00:17:49.652 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:49.652 13:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:49.652 13:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:49.652 13:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:49.652 13:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:49.652 13:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:49.652 13:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:49.911 13:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:50.170 [ 00:17:50.170 { 00:17:50.170 "name": "BaseBdev4", 00:17:50.170 "aliases": [ 00:17:50.170 "7a17fbf4-75e5-4592-b956-32e33331bb2f" 00:17:50.170 ], 00:17:50.170 "product_name": "Malloc disk", 00:17:50.170 "block_size": 512, 00:17:50.170 "num_blocks": 65536, 00:17:50.170 "uuid": "7a17fbf4-75e5-4592-b956-32e33331bb2f", 00:17:50.170 "assigned_rate_limits": { 00:17:50.170 "rw_ios_per_sec": 0, 00:17:50.170 "rw_mbytes_per_sec": 0, 00:17:50.170 "r_mbytes_per_sec": 0, 00:17:50.170 "w_mbytes_per_sec": 0 
00:17:50.170 }, 00:17:50.170 "claimed": false, 00:17:50.170 "zoned": false, 00:17:50.170 "supported_io_types": { 00:17:50.170 "read": true, 00:17:50.170 "write": true, 00:17:50.170 "unmap": true, 00:17:50.170 "flush": true, 00:17:50.170 "reset": true, 00:17:50.170 "nvme_admin": false, 00:17:50.170 "nvme_io": false, 00:17:50.170 "nvme_io_md": false, 00:17:50.170 "write_zeroes": true, 00:17:50.170 "zcopy": true, 00:17:50.170 "get_zone_info": false, 00:17:50.170 "zone_management": false, 00:17:50.170 "zone_append": false, 00:17:50.170 "compare": false, 00:17:50.170 "compare_and_write": false, 00:17:50.170 "abort": true, 00:17:50.170 "seek_hole": false, 00:17:50.170 "seek_data": false, 00:17:50.170 "copy": true, 00:17:50.170 "nvme_iov_md": false 00:17:50.170 }, 00:17:50.170 "memory_domains": [ 00:17:50.170 { 00:17:50.170 "dma_device_id": "system", 00:17:50.170 "dma_device_type": 1 00:17:50.170 }, 00:17:50.170 { 00:17:50.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.170 "dma_device_type": 2 00:17:50.170 } 00:17:50.170 ], 00:17:50.170 "driver_specific": {} 00:17:50.170 } 00:17:50.170 ] 00:17:50.170 13:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:50.170 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:50.170 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:50.170 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:50.736 [2024-07-12 13:43:39.146147] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:50.736 [2024-07-12 13:43:39.146203] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:50.736 [2024-07-12 13:43:39.146225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:50.736 [2024-07-12 13:43:39.147565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:50.736 [2024-07-12 13:43:39.147607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.736 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:50.994 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:50.994 "name": "Existed_Raid", 00:17:50.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.994 "strip_size_kb": 64, 00:17:50.994 "state": "configuring", 00:17:50.994 "raid_level": "raid0", 00:17:50.994 "superblock": false, 00:17:50.994 "num_base_bdevs": 4, 00:17:50.994 "num_base_bdevs_discovered": 3, 00:17:50.994 "num_base_bdevs_operational": 4, 00:17:50.994 "base_bdevs_list": [ 00:17:50.994 { 00:17:50.994 "name": "BaseBdev1", 00:17:50.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.994 "is_configured": false, 00:17:50.994 "data_offset": 0, 00:17:50.994 "data_size": 0 00:17:50.994 }, 00:17:50.994 { 00:17:50.994 "name": "BaseBdev2", 00:17:50.994 "uuid": "eddcc6a7-98cc-4548-983d-ffc355164718", 00:17:50.994 "is_configured": true, 00:17:50.994 "data_offset": 0, 00:17:50.994 "data_size": 65536 00:17:50.994 }, 00:17:50.994 { 00:17:50.994 "name": "BaseBdev3", 00:17:50.994 "uuid": "410b672f-7a0c-4ef2-8fe6-b44efcc5985c", 00:17:50.994 "is_configured": true, 00:17:50.994 "data_offset": 0, 00:17:50.994 "data_size": 65536 00:17:50.994 }, 00:17:50.994 { 00:17:50.994 "name": "BaseBdev4", 00:17:50.994 "uuid": "7a17fbf4-75e5-4592-b956-32e33331bb2f", 00:17:50.994 "is_configured": true, 00:17:50.994 "data_offset": 0, 00:17:50.994 "data_size": 65536 00:17:50.994 } 00:17:50.994 ] 00:17:50.994 }' 00:17:50.994 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.994 13:43:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.560 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:51.818 [2024-07-12 13:43:40.241021] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.818 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:52.077 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.077 "name": "Existed_Raid", 00:17:52.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.077 "strip_size_kb": 64, 00:17:52.077 "state": "configuring", 00:17:52.077 "raid_level": "raid0", 00:17:52.077 "superblock": false, 00:17:52.077 "num_base_bdevs": 4, 00:17:52.077 "num_base_bdevs_discovered": 2, 00:17:52.077 "num_base_bdevs_operational": 4, 00:17:52.077 "base_bdevs_list": [ 00:17:52.077 { 00:17:52.077 "name": "BaseBdev1", 00:17:52.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.077 "is_configured": false, 00:17:52.077 "data_offset": 0, 00:17:52.077 "data_size": 0 00:17:52.077 }, 00:17:52.077 { 00:17:52.077 "name": null, 00:17:52.077 "uuid": "eddcc6a7-98cc-4548-983d-ffc355164718", 00:17:52.077 "is_configured": false, 00:17:52.077 "data_offset": 0, 00:17:52.077 "data_size": 65536 00:17:52.077 }, 00:17:52.077 { 00:17:52.077 "name": "BaseBdev3", 00:17:52.077 "uuid": "410b672f-7a0c-4ef2-8fe6-b44efcc5985c", 00:17:52.077 "is_configured": true, 00:17:52.077 "data_offset": 0, 00:17:52.077 "data_size": 65536 00:17:52.077 }, 00:17:52.077 { 00:17:52.077 "name": "BaseBdev4", 00:17:52.077 "uuid": "7a17fbf4-75e5-4592-b956-32e33331bb2f", 00:17:52.077 "is_configured": true, 00:17:52.077 "data_offset": 0, 00:17:52.077 "data_size": 65536 00:17:52.077 } 00:17:52.077 ] 00:17:52.077 }' 00:17:52.077 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.077 13:43:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.644 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.644 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:52.902 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:52.902 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:53.160 [2024-07-12 13:43:41.581208] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:53.160 BaseBdev1 00:17:53.161 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:53.161 13:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:53.161 13:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:53.161 13:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:53.161 13:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:53.161 13:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:53.161 13:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:53.728 13:43:42 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:53.986 [ 00:17:53.986 { 00:17:53.986 "name": "BaseBdev1", 00:17:53.986 "aliases": [ 00:17:53.986 "4a00fba9-05aa-478e-a1b6-5d544ec0c8be" 00:17:53.986 ], 00:17:53.986 "product_name": "Malloc disk", 00:17:53.986 "block_size": 512, 00:17:53.986 "num_blocks": 65536, 00:17:53.986 "uuid": "4a00fba9-05aa-478e-a1b6-5d544ec0c8be", 00:17:53.986 "assigned_rate_limits": { 00:17:53.986 "rw_ios_per_sec": 0, 00:17:53.986 "rw_mbytes_per_sec": 0, 00:17:53.986 "r_mbytes_per_sec": 0, 00:17:53.986 "w_mbytes_per_sec": 0 00:17:53.986 }, 00:17:53.986 "claimed": true, 00:17:53.986 "claim_type": "exclusive_write", 00:17:53.986 "zoned": false, 00:17:53.986 "supported_io_types": { 00:17:53.986 "read": true, 00:17:53.986 "write": true, 00:17:53.986 "unmap": true, 00:17:53.986 "flush": true, 00:17:53.986 "reset": true, 00:17:53.986 "nvme_admin": false, 00:17:53.986 "nvme_io": false, 00:17:53.986 "nvme_io_md": false, 00:17:53.986 "write_zeroes": true, 00:17:53.986 "zcopy": true, 00:17:53.986 "get_zone_info": false, 00:17:53.986 "zone_management": false, 00:17:53.986 "zone_append": false, 00:17:53.986 "compare": false, 00:17:53.986 "compare_and_write": false, 00:17:53.986 "abort": true, 00:17:53.986 "seek_hole": false, 00:17:53.986 "seek_data": false, 00:17:53.986 "copy": true, 00:17:53.986 "nvme_iov_md": false 00:17:53.987 }, 00:17:53.987 "memory_domains": [ 00:17:53.987 { 00:17:53.987 "dma_device_id": "system", 00:17:53.987 "dma_device_type": 1 00:17:53.987 }, 00:17:53.987 { 00:17:53.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.987 "dma_device_type": 2 00:17:53.987 } 00:17:53.987 ], 00:17:53.987 "driver_specific": {} 00:17:53.987 } 00:17:53.987 ] 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.987 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.245 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.245 "name": 
"Existed_Raid", 00:17:54.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.245 "strip_size_kb": 64, 00:17:54.245 "state": "configuring", 00:17:54.245 "raid_level": "raid0", 00:17:54.245 "superblock": false, 00:17:54.245 "num_base_bdevs": 4, 00:17:54.245 "num_base_bdevs_discovered": 3, 00:17:54.245 "num_base_bdevs_operational": 4, 00:17:54.245 "base_bdevs_list": [ 00:17:54.245 { 00:17:54.245 "name": "BaseBdev1", 00:17:54.245 "uuid": "4a00fba9-05aa-478e-a1b6-5d544ec0c8be", 00:17:54.245 "is_configured": true, 00:17:54.245 "data_offset": 0, 00:17:54.245 "data_size": 65536 00:17:54.245 }, 00:17:54.245 { 00:17:54.245 "name": null, 00:17:54.245 "uuid": "eddcc6a7-98cc-4548-983d-ffc355164718", 00:17:54.245 "is_configured": false, 00:17:54.245 "data_offset": 0, 00:17:54.245 "data_size": 65536 00:17:54.245 }, 00:17:54.245 { 00:17:54.245 "name": "BaseBdev3", 00:17:54.245 "uuid": "410b672f-7a0c-4ef2-8fe6-b44efcc5985c", 00:17:54.245 "is_configured": true, 00:17:54.245 "data_offset": 0, 00:17:54.245 "data_size": 65536 00:17:54.245 }, 00:17:54.245 { 00:17:54.245 "name": "BaseBdev4", 00:17:54.245 "uuid": "7a17fbf4-75e5-4592-b956-32e33331bb2f", 00:17:54.245 "is_configured": true, 00:17:54.245 "data_offset": 0, 00:17:54.245 "data_size": 65536 00:17:54.245 } 00:17:54.245 ] 00:17:54.245 }' 00:17:54.245 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.245 13:43:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:54.816 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:54.816 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.816 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:54.816 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:55.383 [2024-07-12 13:43:43.831230] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.383 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:55.642 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.642 "name": "Existed_Raid", 00:17:55.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.642 "strip_size_kb": 64, 00:17:55.642 "state": "configuring", 00:17:55.642 "raid_level": "raid0", 00:17:55.642 "superblock": false, 00:17:55.642 "num_base_bdevs": 4, 00:17:55.642 "num_base_bdevs_discovered": 2, 00:17:55.642 "num_base_bdevs_operational": 4, 00:17:55.642 "base_bdevs_list": [ 00:17:55.642 { 00:17:55.642 "name": "BaseBdev1", 00:17:55.642 "uuid": "4a00fba9-05aa-478e-a1b6-5d544ec0c8be", 00:17:55.642 "is_configured": true, 00:17:55.642 "data_offset": 0, 00:17:55.642 "data_size": 65536 00:17:55.642 }, 00:17:55.642 { 00:17:55.642 "name": null, 00:17:55.642 "uuid": "eddcc6a7-98cc-4548-983d-ffc355164718", 00:17:55.642 "is_configured": false, 00:17:55.642 "data_offset": 0, 00:17:55.642 "data_size": 65536 00:17:55.642 }, 00:17:55.642 { 00:17:55.642 "name": null, 00:17:55.642 "uuid": "410b672f-7a0c-4ef2-8fe6-b44efcc5985c", 00:17:55.642 "is_configured": false, 00:17:55.642 "data_offset": 0, 00:17:55.642 "data_size": 65536 00:17:55.642 }, 00:17:55.642 { 00:17:55.642 "name": "BaseBdev4", 00:17:55.642 "uuid": "7a17fbf4-75e5-4592-b956-32e33331bb2f", 00:17:55.642 "is_configured": true, 00:17:55.642 "data_offset": 0, 00:17:55.642 "data_size": 65536 00:17:55.642 } 00:17:55.642 ] 00:17:55.642 }' 00:17:55.642 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.642 13:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.209 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.209 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:56.468 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:56.468 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:57.036 [2024-07-12 13:43:45.367321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.036 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:57.300 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.300 "name": "Existed_Raid", 00:17:57.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.300 "strip_size_kb": 64, 00:17:57.300 "state": "configuring", 00:17:57.300 "raid_level": "raid0", 00:17:57.301 "superblock": false, 00:17:57.301 "num_base_bdevs": 4, 00:17:57.301 "num_base_bdevs_discovered": 3, 00:17:57.301 "num_base_bdevs_operational": 4, 00:17:57.301 "base_bdevs_list": [ 00:17:57.301 { 00:17:57.301 "name": "BaseBdev1", 00:17:57.301 "uuid": "4a00fba9-05aa-478e-a1b6-5d544ec0c8be", 00:17:57.301 "is_configured": true, 00:17:57.301 "data_offset": 0, 00:17:57.301 "data_size": 65536 00:17:57.301 }, 00:17:57.301 { 00:17:57.301 "name": null, 00:17:57.301 "uuid": "eddcc6a7-98cc-4548-983d-ffc355164718", 00:17:57.301 "is_configured": false, 00:17:57.301 "data_offset": 0, 00:17:57.301 "data_size": 65536 00:17:57.301 }, 00:17:57.301 { 00:17:57.301 "name": "BaseBdev3", 00:17:57.301 "uuid": "410b672f-7a0c-4ef2-8fe6-b44efcc5985c", 00:17:57.301 "is_configured": true, 00:17:57.301 "data_offset": 0, 00:17:57.301 "data_size": 65536 00:17:57.301 }, 00:17:57.301 { 00:17:57.301 "name": "BaseBdev4", 00:17:57.301 "uuid": "7a17fbf4-75e5-4592-b956-32e33331bb2f", 00:17:57.301 "is_configured": true, 00:17:57.301 "data_offset": 0, 00:17:57.301 "data_size": 65536 00:17:57.301 } 00:17:57.301 ] 00:17:57.301 }' 00:17:57.301 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.301 13:43:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.868 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.868 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:58.128 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:58.128 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:58.388 [2024-07-12 13:43:46.943511] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.646 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:58.904 13:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.904 "name": "Existed_Raid", 00:17:58.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.904 "strip_size_kb": 64, 00:17:58.904 "state": "configuring", 00:17:58.904 "raid_level": "raid0", 00:17:58.904 "superblock": false, 00:17:58.904 "num_base_bdevs": 4, 00:17:58.904 "num_base_bdevs_discovered": 2, 00:17:58.904 "num_base_bdevs_operational": 4, 00:17:58.904 "base_bdevs_list": [ 00:17:58.904 { 00:17:58.904 "name": null, 00:17:58.904 "uuid": "4a00fba9-05aa-478e-a1b6-5d544ec0c8be", 00:17:58.904 "is_configured": false, 00:17:58.904 "data_offset": 0, 00:17:58.904 "data_size": 65536 00:17:58.904 }, 00:17:58.904 { 00:17:58.904 "name": null, 00:17:58.904 "uuid": "eddcc6a7-98cc-4548-983d-ffc355164718", 00:17:58.904 "is_configured": false, 00:17:58.904 "data_offset": 0, 00:17:58.904 "data_size": 65536 00:17:58.904 }, 00:17:58.904 { 00:17:58.904 "name": "BaseBdev3", 00:17:58.904 "uuid": "410b672f-7a0c-4ef2-8fe6-b44efcc5985c", 00:17:58.904 "is_configured": true, 00:17:58.904 "data_offset": 0, 00:17:58.904 "data_size": 65536 00:17:58.904 }, 00:17:58.904 { 00:17:58.904 "name": "BaseBdev4", 00:17:58.904 "uuid": "7a17fbf4-75e5-4592-b956-32e33331bb2f", 00:17:58.904 "is_configured": true, 00:17:58.904 "data_offset": 0, 00:17:58.904 "data_size": 65536 00:17:58.904 } 00:17:58.904 ] 00:17:58.904 }' 00:17:58.904 13:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.904 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.472 13:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.472 13:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:59.731 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:59.731 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:59.989 [2024-07-12 13:43:48.540068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:00.246 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.246 "name": "Existed_Raid", 00:18:00.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.246 "strip_size_kb": 64, 00:18:00.246 "state": "configuring", 00:18:00.246 "raid_level": "raid0", 00:18:00.246 "superblock": false, 00:18:00.246 "num_base_bdevs": 4, 00:18:00.246 "num_base_bdevs_discovered": 3, 00:18:00.246 "num_base_bdevs_operational": 4, 00:18:00.246 "base_bdevs_list": [ 00:18:00.246 { 00:18:00.246 "name": null, 00:18:00.246 "uuid": "4a00fba9-05aa-478e-a1b6-5d544ec0c8be", 00:18:00.246 "is_configured": false, 00:18:00.246 "data_offset": 0, 00:18:00.246 "data_size": 65536 00:18:00.246 }, 00:18:00.246 { 00:18:00.246 "name": "BaseBdev2", 00:18:00.246 "uuid": "eddcc6a7-98cc-4548-983d-ffc355164718", 00:18:00.246 "is_configured": true, 00:18:00.247 "data_offset": 0, 00:18:00.247 "data_size": 65536 00:18:00.247 }, 00:18:00.247 { 00:18:00.247 "name": "BaseBdev3", 00:18:00.247 "uuid": "410b672f-7a0c-4ef2-8fe6-b44efcc5985c", 00:18:00.247 "is_configured": true, 00:18:00.247 "data_offset": 0, 00:18:00.247 "data_size": 65536 00:18:00.247 }, 00:18:00.247 { 00:18:00.247 "name": "BaseBdev4", 00:18:00.247 "uuid": "7a17fbf4-75e5-4592-b956-32e33331bb2f", 00:18:00.247 "is_configured": true, 00:18:00.247 "data_offset": 0, 00:18:00.247 "data_size": 65536 00:18:00.247 } 00:18:00.247 ] 00:18:00.247 }' 00:18:00.247 13:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.247 13:43:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:01.181 13:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.181 13:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:01.181 13:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:01.181 13:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:18:01.181 13:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:01.439 13:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4a00fba9-05aa-478e-a1b6-5d544ec0c8be 00:18:02.007 [2024-07-12 13:43:50.417748] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:02.007 [2024-07-12 13:43:50.417790] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cfca20 00:18:02.007 [2024-07-12 13:43:50.417799] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:02.007 [2024-07-12 13:43:50.418007] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf8320 00:18:02.007 [2024-07-12 13:43:50.418132] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cfca20 00:18:02.007 [2024-07-12 13:43:50.418143] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cfca20 00:18:02.007 [2024-07-12 13:43:50.418313] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:02.007 NewBaseBdev 00:18:02.007 13:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:02.007 13:43:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:02.007 13:43:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:02.007 13:43:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:02.007 13:43:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:02.007 13:43:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:02.007 13:43:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:02.574 13:43:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:03.142 [ 00:18:03.142 { 00:18:03.142 "name": "NewBaseBdev", 00:18:03.142 "aliases": [ 00:18:03.142 "4a00fba9-05aa-478e-a1b6-5d544ec0c8be" 00:18:03.142 ], 00:18:03.142 "product_name": "Malloc disk", 00:18:03.142 "block_size": 512, 00:18:03.142 "num_blocks": 65536, 00:18:03.142 "uuid": "4a00fba9-05aa-478e-a1b6-5d544ec0c8be", 00:18:03.142 "assigned_rate_limits": { 00:18:03.142 "rw_ios_per_sec": 0, 00:18:03.142 "rw_mbytes_per_sec": 0, 00:18:03.142 "r_mbytes_per_sec": 0, 00:18:03.142 "w_mbytes_per_sec": 0 00:18:03.142 }, 00:18:03.142 "claimed": true, 00:18:03.142 "claim_type": "exclusive_write", 00:18:03.142 "zoned": false, 00:18:03.142 "supported_io_types": { 00:18:03.142 "read": true, 00:18:03.142 "write": true, 00:18:03.142 "unmap": true, 00:18:03.142 "flush": true, 00:18:03.142 "reset": true, 00:18:03.142 "nvme_admin": false, 00:18:03.142 "nvme_io": false, 00:18:03.142 "nvme_io_md": false, 00:18:03.142 "write_zeroes": true, 00:18:03.142 "zcopy": true, 00:18:03.142 "get_zone_info": false, 00:18:03.142 "zone_management": false, 00:18:03.142 "zone_append": false, 00:18:03.142 "compare": false, 00:18:03.142 
"compare_and_write": false, 00:18:03.142 "abort": true, 00:18:03.142 "seek_hole": false, 00:18:03.142 "seek_data": false, 00:18:03.143 "copy": true, 00:18:03.143 "nvme_iov_md": false 00:18:03.143 }, 00:18:03.143 "memory_domains": [ 00:18:03.143 { 00:18:03.143 "dma_device_id": "system", 00:18:03.143 "dma_device_type": 1 00:18:03.143 }, 00:18:03.143 { 00:18:03.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.143 "dma_device_type": 2 00:18:03.143 } 00:18:03.143 ], 00:18:03.143 "driver_specific": {} 00:18:03.143 } 00:18:03.143 ] 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.143 "name": "Existed_Raid", 00:18:03.143 "uuid": "31611887-96f5-41e1-a860-68f0d9cff0d1", 00:18:03.143 "strip_size_kb": 64, 00:18:03.143 "state": "online", 00:18:03.143 "raid_level": "raid0", 00:18:03.143 "superblock": false, 00:18:03.143 "num_base_bdevs": 4, 00:18:03.143 "num_base_bdevs_discovered": 4, 00:18:03.143 "num_base_bdevs_operational": 4, 00:18:03.143 "base_bdevs_list": [ 00:18:03.143 { 00:18:03.143 "name": "NewBaseBdev", 00:18:03.143 "uuid": "4a00fba9-05aa-478e-a1b6-5d544ec0c8be", 00:18:03.143 "is_configured": true, 00:18:03.143 "data_offset": 0, 00:18:03.143 "data_size": 65536 00:18:03.143 }, 00:18:03.143 { 00:18:03.143 "name": "BaseBdev2", 00:18:03.143 "uuid": "eddcc6a7-98cc-4548-983d-ffc355164718", 00:18:03.143 "is_configured": true, 00:18:03.143 "data_offset": 0, 00:18:03.143 "data_size": 65536 00:18:03.143 }, 00:18:03.143 { 00:18:03.143 "name": "BaseBdev3", 00:18:03.143 "uuid": "410b672f-7a0c-4ef2-8fe6-b44efcc5985c", 00:18:03.143 "is_configured": true, 00:18:03.143 "data_offset": 0, 00:18:03.143 "data_size": 65536 00:18:03.143 }, 00:18:03.143 { 00:18:03.143 "name": "BaseBdev4", 00:18:03.143 "uuid": "7a17fbf4-75e5-4592-b956-32e33331bb2f", 00:18:03.143 "is_configured": true, 00:18:03.143 "data_offset": 0, 00:18:03.143 "data_size": 65536 00:18:03.143 } 00:18:03.143 ] 00:18:03.143 }' 
00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.143 13:43:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:04.080 [2024-07-12 13:43:52.571792] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:04.080 "name": "Existed_Raid", 00:18:04.080 "aliases": [ 00:18:04.080 "31611887-96f5-41e1-a860-68f0d9cff0d1" 00:18:04.080 ], 00:18:04.080 "product_name": "Raid Volume", 00:18:04.080 "block_size": 512, 00:18:04.080 "num_blocks": 262144, 00:18:04.080 "uuid": "31611887-96f5-41e1-a860-68f0d9cff0d1", 00:18:04.080 "assigned_rate_limits": { 00:18:04.080 "rw_ios_per_sec": 0, 00:18:04.080 "rw_mbytes_per_sec": 0, 00:18:04.080 "r_mbytes_per_sec": 0, 00:18:04.080 "w_mbytes_per_sec": 0 00:18:04.080 }, 00:18:04.080 "claimed": false, 00:18:04.080 "zoned": false, 00:18:04.080 "supported_io_types": { 00:18:04.080 "read": true, 00:18:04.080 "write": true, 00:18:04.080 "unmap": true, 00:18:04.080 "flush": true, 00:18:04.080 "reset": true, 00:18:04.080 "nvme_admin": false, 00:18:04.080 "nvme_io": false, 00:18:04.080 "nvme_io_md": false, 00:18:04.080 "write_zeroes": true, 00:18:04.080 "zcopy": false, 00:18:04.080 "get_zone_info": false, 00:18:04.080 "zone_management": false, 00:18:04.080 "zone_append": false, 00:18:04.080 "compare": false, 00:18:04.080 "compare_and_write": false, 00:18:04.080 "abort": false, 00:18:04.080 "seek_hole": false, 00:18:04.080 "seek_data": false, 00:18:04.080 "copy": false, 00:18:04.080 "nvme_iov_md": false 00:18:04.080 }, 00:18:04.080 "memory_domains": [ 00:18:04.080 { 00:18:04.080 "dma_device_id": "system", 00:18:04.080 "dma_device_type": 1 00:18:04.080 }, 00:18:04.080 { 00:18:04.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.080 "dma_device_type": 2 00:18:04.080 }, 00:18:04.080 { 00:18:04.080 "dma_device_id": "system", 00:18:04.080 "dma_device_type": 1 00:18:04.080 }, 00:18:04.080 { 00:18:04.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.080 "dma_device_type": 2 00:18:04.080 }, 00:18:04.080 { 00:18:04.080 "dma_device_id": "system", 00:18:04.080 "dma_device_type": 1 00:18:04.080 }, 00:18:04.080 { 00:18:04.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.080 "dma_device_type": 2 00:18:04.080 }, 00:18:04.080 { 00:18:04.080 "dma_device_id": "system", 00:18:04.080 "dma_device_type": 1 00:18:04.080 }, 00:18:04.080 { 00:18:04.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:18:04.080 "dma_device_type": 2 00:18:04.080 } 00:18:04.080 ], 00:18:04.080 "driver_specific": { 00:18:04.080 "raid": { 00:18:04.080 "uuid": "31611887-96f5-41e1-a860-68f0d9cff0d1", 00:18:04.080 "strip_size_kb": 64, 00:18:04.080 "state": "online", 00:18:04.080 "raid_level": "raid0", 00:18:04.080 "superblock": false, 00:18:04.080 "num_base_bdevs": 4, 00:18:04.080 "num_base_bdevs_discovered": 4, 00:18:04.080 "num_base_bdevs_operational": 4, 00:18:04.080 "base_bdevs_list": [ 00:18:04.080 { 00:18:04.080 "name": "NewBaseBdev", 00:18:04.080 "uuid": "4a00fba9-05aa-478e-a1b6-5d544ec0c8be", 00:18:04.080 "is_configured": true, 00:18:04.080 "data_offset": 0, 00:18:04.080 "data_size": 65536 00:18:04.080 }, 00:18:04.080 { 00:18:04.080 "name": "BaseBdev2", 00:18:04.080 "uuid": "eddcc6a7-98cc-4548-983d-ffc355164718", 00:18:04.080 "is_configured": true, 00:18:04.080 "data_offset": 0, 00:18:04.080 "data_size": 65536 00:18:04.080 }, 00:18:04.080 { 00:18:04.080 "name": "BaseBdev3", 00:18:04.080 "uuid": "410b672f-7a0c-4ef2-8fe6-b44efcc5985c", 00:18:04.080 "is_configured": true, 00:18:04.080 "data_offset": 0, 00:18:04.080 "data_size": 65536 00:18:04.080 }, 00:18:04.080 { 00:18:04.080 "name": "BaseBdev4", 00:18:04.080 "uuid": "7a17fbf4-75e5-4592-b956-32e33331bb2f", 00:18:04.080 "is_configured": true, 00:18:04.080 "data_offset": 0, 00:18:04.080 "data_size": 65536 00:18:04.080 } 00:18:04.080 ] 00:18:04.080 } 00:18:04.080 } 00:18:04.080 }' 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:04.080 BaseBdev2 00:18:04.080 BaseBdev3 00:18:04.080 BaseBdev4' 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:04.080 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:04.338 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:04.338 "name": "NewBaseBdev", 00:18:04.338 "aliases": [ 00:18:04.338 "4a00fba9-05aa-478e-a1b6-5d544ec0c8be" 00:18:04.338 ], 00:18:04.338 "product_name": "Malloc disk", 00:18:04.338 "block_size": 512, 00:18:04.338 "num_blocks": 65536, 00:18:04.338 "uuid": "4a00fba9-05aa-478e-a1b6-5d544ec0c8be", 00:18:04.338 "assigned_rate_limits": { 00:18:04.338 "rw_ios_per_sec": 0, 00:18:04.338 "rw_mbytes_per_sec": 0, 00:18:04.338 "r_mbytes_per_sec": 0, 00:18:04.338 "w_mbytes_per_sec": 0 00:18:04.338 }, 00:18:04.338 "claimed": true, 00:18:04.338 "claim_type": "exclusive_write", 00:18:04.338 "zoned": false, 00:18:04.338 "supported_io_types": { 00:18:04.338 "read": true, 00:18:04.338 "write": true, 00:18:04.338 "unmap": true, 00:18:04.338 "flush": true, 00:18:04.338 "reset": true, 00:18:04.338 "nvme_admin": false, 00:18:04.338 "nvme_io": false, 00:18:04.338 "nvme_io_md": false, 00:18:04.338 "write_zeroes": true, 00:18:04.338 "zcopy": true, 00:18:04.338 "get_zone_info": false, 00:18:04.338 "zone_management": false, 00:18:04.338 "zone_append": false, 00:18:04.338 "compare": false, 00:18:04.338 "compare_and_write": false, 00:18:04.338 "abort": true, 00:18:04.338 "seek_hole": false, 00:18:04.338 "seek_data": false, 00:18:04.338 
"copy": true, 00:18:04.338 "nvme_iov_md": false 00:18:04.338 }, 00:18:04.338 "memory_domains": [ 00:18:04.338 { 00:18:04.338 "dma_device_id": "system", 00:18:04.338 "dma_device_type": 1 00:18:04.338 }, 00:18:04.338 { 00:18:04.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.338 "dma_device_type": 2 00:18:04.338 } 00:18:04.338 ], 00:18:04.338 "driver_specific": {} 00:18:04.338 }' 00:18:04.338 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:04.596 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:04.596 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:04.596 13:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:04.596 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:04.596 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:04.596 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:04.596 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:04.596 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:04.596 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:04.856 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:04.856 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:04.856 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:04.856 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:04.856 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:05.116 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:05.116 "name": "BaseBdev2", 00:18:05.116 "aliases": [ 00:18:05.116 "eddcc6a7-98cc-4548-983d-ffc355164718" 00:18:05.116 ], 00:18:05.116 "product_name": "Malloc disk", 00:18:05.116 "block_size": 512, 00:18:05.116 "num_blocks": 65536, 00:18:05.116 "uuid": "eddcc6a7-98cc-4548-983d-ffc355164718", 00:18:05.116 "assigned_rate_limits": { 00:18:05.116 "rw_ios_per_sec": 0, 00:18:05.116 "rw_mbytes_per_sec": 0, 00:18:05.116 "r_mbytes_per_sec": 0, 00:18:05.116 "w_mbytes_per_sec": 0 00:18:05.116 }, 00:18:05.116 "claimed": true, 00:18:05.116 "claim_type": "exclusive_write", 00:18:05.116 "zoned": false, 00:18:05.116 "supported_io_types": { 00:18:05.116 "read": true, 00:18:05.116 "write": true, 00:18:05.116 "unmap": true, 00:18:05.116 "flush": true, 00:18:05.116 "reset": true, 00:18:05.116 "nvme_admin": false, 00:18:05.116 "nvme_io": false, 00:18:05.116 "nvme_io_md": false, 00:18:05.116 "write_zeroes": true, 00:18:05.116 "zcopy": true, 00:18:05.116 "get_zone_info": false, 00:18:05.116 "zone_management": false, 00:18:05.116 "zone_append": false, 00:18:05.116 "compare": false, 00:18:05.116 "compare_and_write": false, 00:18:05.116 "abort": true, 00:18:05.116 "seek_hole": false, 00:18:05.116 "seek_data": false, 00:18:05.116 "copy": true, 00:18:05.116 "nvme_iov_md": false 00:18:05.116 }, 00:18:05.116 "memory_domains": [ 00:18:05.116 { 00:18:05.116 "dma_device_id": "system", 00:18:05.116 
"dma_device_type": 1 00:18:05.116 }, 00:18:05.116 { 00:18:05.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.116 "dma_device_type": 2 00:18:05.116 } 00:18:05.116 ], 00:18:05.116 "driver_specific": {} 00:18:05.116 }' 00:18:05.116 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.116 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.116 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:05.116 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.116 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.116 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:05.116 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.375 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.375 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:05.375 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.375 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.375 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:05.375 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:05.375 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:05.375 13:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:05.635 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:05.635 "name": "BaseBdev3", 00:18:05.635 "aliases": [ 00:18:05.635 "410b672f-7a0c-4ef2-8fe6-b44efcc5985c" 00:18:05.635 ], 00:18:05.635 "product_name": "Malloc disk", 00:18:05.635 "block_size": 512, 00:18:05.635 "num_blocks": 65536, 00:18:05.635 "uuid": "410b672f-7a0c-4ef2-8fe6-b44efcc5985c", 00:18:05.635 "assigned_rate_limits": { 00:18:05.635 "rw_ios_per_sec": 0, 00:18:05.635 "rw_mbytes_per_sec": 0, 00:18:05.635 "r_mbytes_per_sec": 0, 00:18:05.635 "w_mbytes_per_sec": 0 00:18:05.635 }, 00:18:05.635 "claimed": true, 00:18:05.635 "claim_type": "exclusive_write", 00:18:05.635 "zoned": false, 00:18:05.635 "supported_io_types": { 00:18:05.635 "read": true, 00:18:05.635 "write": true, 00:18:05.635 "unmap": true, 00:18:05.635 "flush": true, 00:18:05.635 "reset": true, 00:18:05.635 "nvme_admin": false, 00:18:05.635 "nvme_io": false, 00:18:05.635 "nvme_io_md": false, 00:18:05.635 "write_zeroes": true, 00:18:05.635 "zcopy": true, 00:18:05.635 "get_zone_info": false, 00:18:05.635 "zone_management": false, 00:18:05.635 "zone_append": false, 00:18:05.635 "compare": false, 00:18:05.635 "compare_and_write": false, 00:18:05.635 "abort": true, 00:18:05.635 "seek_hole": false, 00:18:05.635 "seek_data": false, 00:18:05.635 "copy": true, 00:18:05.635 "nvme_iov_md": false 00:18:05.635 }, 00:18:05.635 "memory_domains": [ 00:18:05.635 { 00:18:05.635 "dma_device_id": "system", 00:18:05.635 "dma_device_type": 1 00:18:05.635 }, 00:18:05.635 { 00:18:05.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.635 "dma_device_type": 2 00:18:05.635 } 00:18:05.635 ], 
00:18:05.635 "driver_specific": {} 00:18:05.635 }' 00:18:05.635 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.635 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:05.635 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:05.635 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.635 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:05.894 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:05.894 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.894 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:05.894 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:05.894 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.894 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:05.894 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:05.894 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:05.894 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:05.894 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:06.153 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:06.153 "name": "BaseBdev4", 00:18:06.153 "aliases": [ 00:18:06.153 "7a17fbf4-75e5-4592-b956-32e33331bb2f" 00:18:06.153 ], 00:18:06.153 "product_name": "Malloc disk", 00:18:06.153 "block_size": 512, 00:18:06.153 "num_blocks": 65536, 00:18:06.153 "uuid": "7a17fbf4-75e5-4592-b956-32e33331bb2f", 00:18:06.153 "assigned_rate_limits": { 00:18:06.153 "rw_ios_per_sec": 0, 00:18:06.153 "rw_mbytes_per_sec": 0, 00:18:06.153 "r_mbytes_per_sec": 0, 00:18:06.153 "w_mbytes_per_sec": 0 00:18:06.153 }, 00:18:06.153 "claimed": true, 00:18:06.153 "claim_type": "exclusive_write", 00:18:06.153 "zoned": false, 00:18:06.153 "supported_io_types": { 00:18:06.153 "read": true, 00:18:06.153 "write": true, 00:18:06.153 "unmap": true, 00:18:06.153 "flush": true, 00:18:06.153 "reset": true, 00:18:06.153 "nvme_admin": false, 00:18:06.153 "nvme_io": false, 00:18:06.153 "nvme_io_md": false, 00:18:06.153 "write_zeroes": true, 00:18:06.153 "zcopy": true, 00:18:06.153 "get_zone_info": false, 00:18:06.153 "zone_management": false, 00:18:06.153 "zone_append": false, 00:18:06.153 "compare": false, 00:18:06.153 "compare_and_write": false, 00:18:06.153 "abort": true, 00:18:06.153 "seek_hole": false, 00:18:06.153 "seek_data": false, 00:18:06.153 "copy": true, 00:18:06.153 "nvme_iov_md": false 00:18:06.153 }, 00:18:06.153 "memory_domains": [ 00:18:06.153 { 00:18:06.153 "dma_device_id": "system", 00:18:06.153 "dma_device_type": 1 00:18:06.153 }, 00:18:06.153 { 00:18:06.153 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.153 "dma_device_type": 2 00:18:06.153 } 00:18:06.153 ], 00:18:06.153 "driver_specific": {} 00:18:06.153 }' 00:18:06.153 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.153 13:43:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.412 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:06.412 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.412 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.412 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:06.412 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:06.412 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:06.412 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:06.412 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:06.412 13:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:06.671 13:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:06.671 13:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:06.671 [2024-07-12 13:43:55.234735] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:06.671 [2024-07-12 13:43:55.234764] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:06.671 [2024-07-12 13:43:55.234822] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:06.671 [2024-07-12 13:43:55.234883] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:06.671 [2024-07-12 13:43:55.234895] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cfca20 name Existed_Raid, state offline 00:18:06.930 13:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 489775 00:18:06.930 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 489775 ']' 00:18:06.930 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 489775 00:18:06.930 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:06.930 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:06.930 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 489775 00:18:06.930 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:06.930 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:06.930 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 489775' 00:18:06.930 killing process with pid 489775 00:18:06.930 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 489775 00:18:06.930 [2024-07-12 13:43:55.300722] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:06.930 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 489775 00:18:06.930 [2024-07-12 13:43:55.339740] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:07.191 13:43:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:07.191 00:18:07.191 real 0m35.293s 00:18:07.191 user 1m4.960s 00:18:07.191 sys 0m6.101s 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.191 ************************************ 00:18:07.191 END TEST raid_state_function_test 00:18:07.191 ************************************ 00:18:07.191 13:43:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:07.191 13:43:55 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:18:07.191 13:43:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:07.191 13:43:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:07.191 13:43:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:07.191 ************************************ 00:18:07.191 START TEST raid_state_function_test_sb 00:18:07.191 ************************************ 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=495009 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 495009' 00:18:07.191 Process raid pid: 495009 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 495009 /var/tmp/spdk-raid.sock 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 495009 ']' 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:07.191 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:07.191 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:07.191 [2024-07-12 13:43:55.728381] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:18:07.191 [2024-07-12 13:43:55.728454] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:07.451 [2024-07-12 13:43:55.860085] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:07.451 [2024-07-12 13:43:55.961636] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:07.451 [2024-07-12 13:43:56.027144] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:07.451 [2024-07-12 13:43:56.027178] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:08.388 [2024-07-12 13:43:56.879000] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:08.388 [2024-07-12 13:43:56.879042] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:08.388 [2024-07-12 13:43:56.879053] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:08.388 [2024-07-12 13:43:56.879065] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:08.388 [2024-07-12 13:43:56.879074] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:08.388 [2024-07-12 13:43:56.879085] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:08.388 [2024-07-12 13:43:56.879094] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:08.388 [2024-07-12 13:43:56.879106] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:18:08.388 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.648 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.648 "name": "Existed_Raid", 00:18:08.648 "uuid": "85c6f67d-85c4-4024-af0b-b043d2e4d3d2", 00:18:08.648 "strip_size_kb": 64, 00:18:08.648 "state": "configuring", 00:18:08.648 "raid_level": "raid0", 00:18:08.648 "superblock": true, 00:18:08.648 "num_base_bdevs": 4, 00:18:08.648 "num_base_bdevs_discovered": 0, 00:18:08.648 "num_base_bdevs_operational": 4, 00:18:08.648 "base_bdevs_list": [ 00:18:08.648 { 00:18:08.648 "name": "BaseBdev1", 00:18:08.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.648 "is_configured": false, 00:18:08.648 "data_offset": 0, 00:18:08.648 "data_size": 0 00:18:08.648 }, 00:18:08.648 { 00:18:08.648 "name": "BaseBdev2", 00:18:08.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.648 "is_configured": false, 00:18:08.648 "data_offset": 0, 00:18:08.648 "data_size": 0 00:18:08.648 }, 00:18:08.648 { 00:18:08.648 "name": "BaseBdev3", 00:18:08.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.648 "is_configured": false, 00:18:08.648 "data_offset": 0, 00:18:08.648 "data_size": 0 00:18:08.648 }, 00:18:08.648 { 00:18:08.648 "name": "BaseBdev4", 00:18:08.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.648 "is_configured": false, 00:18:08.648 "data_offset": 0, 00:18:08.648 "data_size": 0 00:18:08.648 } 00:18:08.648 ] 00:18:08.648 }' 00:18:08.648 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.648 13:43:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:09.216 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:09.475 [2024-07-12 13:43:57.929820] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:09.475 [2024-07-12 13:43:57.929855] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x198a370 name Existed_Raid, state configuring 00:18:09.475 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:09.735 [2024-07-12 13:43:58.178528] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:09.735 [2024-07-12 13:43:58.178557] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:09.735 [2024-07-12 13:43:58.178567] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:09.735 [2024-07-12 13:43:58.178579] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:09.735 [2024-07-12 13:43:58.178588] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:09.735 [2024-07-12 13:43:58.178599] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:09.735 [2024-07-12 13:43:58.178608] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:09.735 
[2024-07-12 13:43:58.178619] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:09.735 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:09.993 [2024-07-12 13:43:58.433144] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:09.993 BaseBdev1 00:18:09.993 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:09.993 13:43:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:09.993 13:43:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:09.993 13:43:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:09.993 13:43:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:09.993 13:43:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:09.993 13:43:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:10.252 13:43:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:10.512 [ 00:18:10.512 { 00:18:10.512 "name": "BaseBdev1", 00:18:10.512 "aliases": [ 00:18:10.512 "423f1ae8-2be0-434e-8bea-b4f38bf5e84f" 00:18:10.512 ], 00:18:10.512 "product_name": "Malloc disk", 00:18:10.512 "block_size": 512, 00:18:10.512 "num_blocks": 65536, 00:18:10.512 "uuid": "423f1ae8-2be0-434e-8bea-b4f38bf5e84f", 00:18:10.512 "assigned_rate_limits": { 00:18:10.512 "rw_ios_per_sec": 0, 00:18:10.512 "rw_mbytes_per_sec": 0, 00:18:10.512 "r_mbytes_per_sec": 0, 00:18:10.512 "w_mbytes_per_sec": 0 00:18:10.512 }, 00:18:10.512 "claimed": true, 00:18:10.512 "claim_type": "exclusive_write", 00:18:10.512 "zoned": false, 00:18:10.512 "supported_io_types": { 00:18:10.512 "read": true, 00:18:10.512 "write": true, 00:18:10.512 "unmap": true, 00:18:10.512 "flush": true, 00:18:10.512 "reset": true, 00:18:10.512 "nvme_admin": false, 00:18:10.512 "nvme_io": false, 00:18:10.512 "nvme_io_md": false, 00:18:10.512 "write_zeroes": true, 00:18:10.512 "zcopy": true, 00:18:10.512 "get_zone_info": false, 00:18:10.512 "zone_management": false, 00:18:10.512 "zone_append": false, 00:18:10.512 "compare": false, 00:18:10.512 "compare_and_write": false, 00:18:10.512 "abort": true, 00:18:10.512 "seek_hole": false, 00:18:10.512 "seek_data": false, 00:18:10.512 "copy": true, 00:18:10.512 "nvme_iov_md": false 00:18:10.512 }, 00:18:10.512 "memory_domains": [ 00:18:10.512 { 00:18:10.512 "dma_device_id": "system", 00:18:10.512 "dma_device_type": 1 00:18:10.512 }, 00:18:10.512 { 00:18:10.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.512 "dma_device_type": 2 00:18:10.512 } 00:18:10.512 ], 00:18:10.512 "driver_specific": {} 00:18:10.512 } 00:18:10.512 ] 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:10.512 13:43:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.512 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:10.771 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.771 "name": "Existed_Raid", 00:18:10.771 "uuid": "aa262a27-70b7-4ce1-bcbf-c9b74b72d74f", 00:18:10.771 "strip_size_kb": 64, 00:18:10.771 "state": "configuring", 00:18:10.771 "raid_level": "raid0", 00:18:10.771 "superblock": true, 00:18:10.771 "num_base_bdevs": 4, 00:18:10.771 "num_base_bdevs_discovered": 1, 00:18:10.771 "num_base_bdevs_operational": 4, 00:18:10.771 "base_bdevs_list": [ 00:18:10.771 { 00:18:10.771 "name": "BaseBdev1", 00:18:10.771 "uuid": "423f1ae8-2be0-434e-8bea-b4f38bf5e84f", 00:18:10.771 "is_configured": true, 00:18:10.771 "data_offset": 2048, 00:18:10.771 "data_size": 63488 00:18:10.771 }, 00:18:10.771 { 00:18:10.771 "name": "BaseBdev2", 00:18:10.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.771 "is_configured": false, 00:18:10.771 "data_offset": 0, 00:18:10.771 "data_size": 0 00:18:10.771 }, 00:18:10.771 { 00:18:10.771 "name": "BaseBdev3", 00:18:10.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.771 "is_configured": false, 00:18:10.771 "data_offset": 0, 00:18:10.771 "data_size": 0 00:18:10.771 }, 00:18:10.771 { 00:18:10.771 "name": "BaseBdev4", 00:18:10.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.771 "is_configured": false, 00:18:10.771 "data_offset": 0, 00:18:10.771 "data_size": 0 00:18:10.771 } 00:18:10.771 ] 00:18:10.771 }' 00:18:10.771 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.771 13:43:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:11.339 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:11.598 [2024-07-12 13:43:59.993287] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:11.598 [2024-07-12 13:43:59.993330] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1989be0 name Existed_Raid, state configuring 00:18:11.598 13:44:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:11.856 [2024-07-12 13:44:00.246021] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:11.856 [2024-07-12 13:44:00.247459] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:11.856 [2024-07-12 13:44:00.247492] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:11.856 [2024-07-12 13:44:00.247504] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:11.856 [2024-07-12 13:44:00.247515] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:11.856 [2024-07-12 13:44:00.247524] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:11.856 [2024-07-12 13:44:00.247536] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:11.856 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.115 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:12.115 "name": "Existed_Raid", 00:18:12.115 "uuid": "7305f669-3754-4546-92af-205b0c59880e", 00:18:12.115 "strip_size_kb": 64, 00:18:12.115 "state": "configuring", 00:18:12.115 "raid_level": "raid0", 00:18:12.115 "superblock": true, 00:18:12.115 "num_base_bdevs": 4, 00:18:12.115 "num_base_bdevs_discovered": 1, 00:18:12.115 "num_base_bdevs_operational": 4, 00:18:12.115 "base_bdevs_list": [ 00:18:12.115 { 00:18:12.115 "name": "BaseBdev1", 00:18:12.115 "uuid": "423f1ae8-2be0-434e-8bea-b4f38bf5e84f", 00:18:12.115 "is_configured": true, 00:18:12.115 "data_offset": 2048, 
00:18:12.115 "data_size": 63488 00:18:12.115 }, 00:18:12.115 { 00:18:12.115 "name": "BaseBdev2", 00:18:12.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:12.115 "is_configured": false, 00:18:12.115 "data_offset": 0, 00:18:12.115 "data_size": 0 00:18:12.115 }, 00:18:12.115 { 00:18:12.115 "name": "BaseBdev3", 00:18:12.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:12.115 "is_configured": false, 00:18:12.115 "data_offset": 0, 00:18:12.115 "data_size": 0 00:18:12.115 }, 00:18:12.115 { 00:18:12.115 "name": "BaseBdev4", 00:18:12.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:12.115 "is_configured": false, 00:18:12.115 "data_offset": 0, 00:18:12.115 "data_size": 0 00:18:12.115 } 00:18:12.115 ] 00:18:12.115 }' 00:18:12.115 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:12.115 13:44:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:12.684 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:12.944 [2024-07-12 13:44:01.272238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:12.944 BaseBdev2 00:18:12.944 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:12.944 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:12.944 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:12.944 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:12.944 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:12.944 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:12.944 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:13.203 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:13.203 [ 00:18:13.203 { 00:18:13.203 "name": "BaseBdev2", 00:18:13.203 "aliases": [ 00:18:13.203 "47c97ac2-94b0-40ee-a7e4-13c7450fabd2" 00:18:13.203 ], 00:18:13.203 "product_name": "Malloc disk", 00:18:13.203 "block_size": 512, 00:18:13.203 "num_blocks": 65536, 00:18:13.203 "uuid": "47c97ac2-94b0-40ee-a7e4-13c7450fabd2", 00:18:13.203 "assigned_rate_limits": { 00:18:13.203 "rw_ios_per_sec": 0, 00:18:13.203 "rw_mbytes_per_sec": 0, 00:18:13.203 "r_mbytes_per_sec": 0, 00:18:13.203 "w_mbytes_per_sec": 0 00:18:13.203 }, 00:18:13.203 "claimed": true, 00:18:13.203 "claim_type": "exclusive_write", 00:18:13.203 "zoned": false, 00:18:13.203 "supported_io_types": { 00:18:13.203 "read": true, 00:18:13.203 "write": true, 00:18:13.203 "unmap": true, 00:18:13.203 "flush": true, 00:18:13.203 "reset": true, 00:18:13.203 "nvme_admin": false, 00:18:13.203 "nvme_io": false, 00:18:13.203 "nvme_io_md": false, 00:18:13.203 "write_zeroes": true, 00:18:13.203 "zcopy": true, 00:18:13.203 "get_zone_info": false, 00:18:13.203 "zone_management": false, 00:18:13.203 "zone_append": false, 00:18:13.203 "compare": false, 
00:18:13.203 "compare_and_write": false, 00:18:13.203 "abort": true, 00:18:13.203 "seek_hole": false, 00:18:13.203 "seek_data": false, 00:18:13.203 "copy": true, 00:18:13.203 "nvme_iov_md": false 00:18:13.203 }, 00:18:13.203 "memory_domains": [ 00:18:13.203 { 00:18:13.203 "dma_device_id": "system", 00:18:13.203 "dma_device_type": 1 00:18:13.203 }, 00:18:13.203 { 00:18:13.203 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.203 "dma_device_type": 2 00:18:13.203 } 00:18:13.203 ], 00:18:13.203 "driver_specific": {} 00:18:13.203 } 00:18:13.203 ] 00:18:13.461 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:13.461 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:13.461 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:13.461 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:13.461 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:13.461 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:13.462 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:13.462 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:13.462 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:13.462 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:13.462 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:13.462 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:13.462 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:13.462 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.462 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:13.721 13:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.721 "name": "Existed_Raid", 00:18:13.721 "uuid": "7305f669-3754-4546-92af-205b0c59880e", 00:18:13.721 "strip_size_kb": 64, 00:18:13.721 "state": "configuring", 00:18:13.721 "raid_level": "raid0", 00:18:13.721 "superblock": true, 00:18:13.721 "num_base_bdevs": 4, 00:18:13.721 "num_base_bdevs_discovered": 2, 00:18:13.721 "num_base_bdevs_operational": 4, 00:18:13.721 "base_bdevs_list": [ 00:18:13.721 { 00:18:13.721 "name": "BaseBdev1", 00:18:13.721 "uuid": "423f1ae8-2be0-434e-8bea-b4f38bf5e84f", 00:18:13.721 "is_configured": true, 00:18:13.721 "data_offset": 2048, 00:18:13.721 "data_size": 63488 00:18:13.721 }, 00:18:13.721 { 00:18:13.721 "name": "BaseBdev2", 00:18:13.721 "uuid": "47c97ac2-94b0-40ee-a7e4-13c7450fabd2", 00:18:13.721 "is_configured": true, 00:18:13.721 "data_offset": 2048, 00:18:13.721 "data_size": 63488 00:18:13.721 }, 00:18:13.721 { 00:18:13.721 "name": "BaseBdev3", 00:18:13.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.721 "is_configured": false, 00:18:13.721 "data_offset": 0, 00:18:13.721 
"data_size": 0 00:18:13.721 }, 00:18:13.721 { 00:18:13.721 "name": "BaseBdev4", 00:18:13.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.721 "is_configured": false, 00:18:13.721 "data_offset": 0, 00:18:13.721 "data_size": 0 00:18:13.721 } 00:18:13.721 ] 00:18:13.721 }' 00:18:13.721 13:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.721 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:14.288 13:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:14.547 [2024-07-12 13:44:02.900143] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:14.547 BaseBdev3 00:18:14.547 13:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:14.547 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:14.547 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:14.547 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:14.547 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:14.548 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:14.548 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:14.806 13:44:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:15.064 [ 00:18:15.064 { 00:18:15.064 "name": "BaseBdev3", 00:18:15.064 "aliases": [ 00:18:15.064 "ca137606-c2ae-4288-8bb3-c22a8445382c" 00:18:15.064 ], 00:18:15.064 "product_name": "Malloc disk", 00:18:15.064 "block_size": 512, 00:18:15.064 "num_blocks": 65536, 00:18:15.064 "uuid": "ca137606-c2ae-4288-8bb3-c22a8445382c", 00:18:15.064 "assigned_rate_limits": { 00:18:15.064 "rw_ios_per_sec": 0, 00:18:15.064 "rw_mbytes_per_sec": 0, 00:18:15.064 "r_mbytes_per_sec": 0, 00:18:15.064 "w_mbytes_per_sec": 0 00:18:15.064 }, 00:18:15.064 "claimed": true, 00:18:15.064 "claim_type": "exclusive_write", 00:18:15.064 "zoned": false, 00:18:15.064 "supported_io_types": { 00:18:15.064 "read": true, 00:18:15.064 "write": true, 00:18:15.064 "unmap": true, 00:18:15.064 "flush": true, 00:18:15.064 "reset": true, 00:18:15.064 "nvme_admin": false, 00:18:15.064 "nvme_io": false, 00:18:15.064 "nvme_io_md": false, 00:18:15.064 "write_zeroes": true, 00:18:15.064 "zcopy": true, 00:18:15.064 "get_zone_info": false, 00:18:15.064 "zone_management": false, 00:18:15.064 "zone_append": false, 00:18:15.064 "compare": false, 00:18:15.064 "compare_and_write": false, 00:18:15.064 "abort": true, 00:18:15.064 "seek_hole": false, 00:18:15.064 "seek_data": false, 00:18:15.064 "copy": true, 00:18:15.064 "nvme_iov_md": false 00:18:15.064 }, 00:18:15.064 "memory_domains": [ 00:18:15.064 { 00:18:15.064 "dma_device_id": "system", 00:18:15.064 "dma_device_type": 1 00:18:15.064 }, 00:18:15.064 { 00:18:15.064 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.064 "dma_device_type": 2 
00:18:15.064 } 00:18:15.064 ], 00:18:15.064 "driver_specific": {} 00:18:15.064 } 00:18:15.064 ] 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.064 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:15.323 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.323 "name": "Existed_Raid", 00:18:15.323 "uuid": "7305f669-3754-4546-92af-205b0c59880e", 00:18:15.323 "strip_size_kb": 64, 00:18:15.323 "state": "configuring", 00:18:15.323 "raid_level": "raid0", 00:18:15.323 "superblock": true, 00:18:15.324 "num_base_bdevs": 4, 00:18:15.324 "num_base_bdevs_discovered": 3, 00:18:15.324 "num_base_bdevs_operational": 4, 00:18:15.324 "base_bdevs_list": [ 00:18:15.324 { 00:18:15.324 "name": "BaseBdev1", 00:18:15.324 "uuid": "423f1ae8-2be0-434e-8bea-b4f38bf5e84f", 00:18:15.324 "is_configured": true, 00:18:15.324 "data_offset": 2048, 00:18:15.324 "data_size": 63488 00:18:15.324 }, 00:18:15.324 { 00:18:15.324 "name": "BaseBdev2", 00:18:15.324 "uuid": "47c97ac2-94b0-40ee-a7e4-13c7450fabd2", 00:18:15.324 "is_configured": true, 00:18:15.324 "data_offset": 2048, 00:18:15.324 "data_size": 63488 00:18:15.324 }, 00:18:15.324 { 00:18:15.324 "name": "BaseBdev3", 00:18:15.324 "uuid": "ca137606-c2ae-4288-8bb3-c22a8445382c", 00:18:15.324 "is_configured": true, 00:18:15.324 "data_offset": 2048, 00:18:15.324 "data_size": 63488 00:18:15.324 }, 00:18:15.324 { 00:18:15.324 "name": "BaseBdev4", 00:18:15.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:15.324 "is_configured": false, 00:18:15.324 "data_offset": 0, 00:18:15.324 "data_size": 0 00:18:15.324 } 00:18:15.324 ] 00:18:15.324 }' 00:18:15.324 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.324 13:44:03 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:18:15.892 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:16.151 [2024-07-12 13:44:04.509108] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:16.151 [2024-07-12 13:44:04.509299] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x198ac40 00:18:16.151 [2024-07-12 13:44:04.509314] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:16.151 [2024-07-12 13:44:04.509493] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x198b8c0 00:18:16.151 [2024-07-12 13:44:04.509620] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x198ac40 00:18:16.151 [2024-07-12 13:44:04.509630] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x198ac40 00:18:16.151 [2024-07-12 13:44:04.509729] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:16.151 BaseBdev4 00:18:16.151 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:16.151 13:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:16.151 13:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:16.151 13:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:16.151 13:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:16.151 13:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:16.151 13:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:16.410 13:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:16.669 [ 00:18:16.669 { 00:18:16.669 "name": "BaseBdev4", 00:18:16.669 "aliases": [ 00:18:16.669 "eb250b73-275a-4c71-ba42-30917646bb76" 00:18:16.669 ], 00:18:16.669 "product_name": "Malloc disk", 00:18:16.669 "block_size": 512, 00:18:16.669 "num_blocks": 65536, 00:18:16.669 "uuid": "eb250b73-275a-4c71-ba42-30917646bb76", 00:18:16.669 "assigned_rate_limits": { 00:18:16.669 "rw_ios_per_sec": 0, 00:18:16.669 "rw_mbytes_per_sec": 0, 00:18:16.669 "r_mbytes_per_sec": 0, 00:18:16.669 "w_mbytes_per_sec": 0 00:18:16.669 }, 00:18:16.669 "claimed": true, 00:18:16.669 "claim_type": "exclusive_write", 00:18:16.669 "zoned": false, 00:18:16.669 "supported_io_types": { 00:18:16.669 "read": true, 00:18:16.669 "write": true, 00:18:16.669 "unmap": true, 00:18:16.669 "flush": true, 00:18:16.669 "reset": true, 00:18:16.669 "nvme_admin": false, 00:18:16.669 "nvme_io": false, 00:18:16.669 "nvme_io_md": false, 00:18:16.669 "write_zeroes": true, 00:18:16.669 "zcopy": true, 00:18:16.669 "get_zone_info": false, 00:18:16.669 "zone_management": false, 00:18:16.669 "zone_append": false, 00:18:16.669 "compare": false, 00:18:16.669 "compare_and_write": false, 00:18:16.669 "abort": true, 00:18:16.669 "seek_hole": false, 00:18:16.669 "seek_data": false, 00:18:16.669 "copy": 
true, 00:18:16.669 "nvme_iov_md": false 00:18:16.669 }, 00:18:16.669 "memory_domains": [ 00:18:16.669 { 00:18:16.669 "dma_device_id": "system", 00:18:16.669 "dma_device_type": 1 00:18:16.669 }, 00:18:16.669 { 00:18:16.669 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.669 "dma_device_type": 2 00:18:16.669 } 00:18:16.669 ], 00:18:16.669 "driver_specific": {} 00:18:16.669 } 00:18:16.669 ] 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.669 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:16.929 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:16.929 "name": "Existed_Raid", 00:18:16.929 "uuid": "7305f669-3754-4546-92af-205b0c59880e", 00:18:16.929 "strip_size_kb": 64, 00:18:16.929 "state": "online", 00:18:16.929 "raid_level": "raid0", 00:18:16.929 "superblock": true, 00:18:16.929 "num_base_bdevs": 4, 00:18:16.929 "num_base_bdevs_discovered": 4, 00:18:16.929 "num_base_bdevs_operational": 4, 00:18:16.929 "base_bdevs_list": [ 00:18:16.929 { 00:18:16.929 "name": "BaseBdev1", 00:18:16.929 "uuid": "423f1ae8-2be0-434e-8bea-b4f38bf5e84f", 00:18:16.929 "is_configured": true, 00:18:16.929 "data_offset": 2048, 00:18:16.929 "data_size": 63488 00:18:16.929 }, 00:18:16.929 { 00:18:16.929 "name": "BaseBdev2", 00:18:16.929 "uuid": "47c97ac2-94b0-40ee-a7e4-13c7450fabd2", 00:18:16.929 "is_configured": true, 00:18:16.929 "data_offset": 2048, 00:18:16.929 "data_size": 63488 00:18:16.929 }, 00:18:16.929 { 00:18:16.929 "name": "BaseBdev3", 00:18:16.929 "uuid": "ca137606-c2ae-4288-8bb3-c22a8445382c", 00:18:16.929 "is_configured": true, 00:18:16.929 "data_offset": 2048, 00:18:16.929 "data_size": 63488 00:18:16.929 }, 00:18:16.929 { 00:18:16.929 "name": "BaseBdev4", 00:18:16.929 "uuid": "eb250b73-275a-4c71-ba42-30917646bb76", 00:18:16.929 
"is_configured": true, 00:18:16.929 "data_offset": 2048, 00:18:16.929 "data_size": 63488 00:18:16.929 } 00:18:16.929 ] 00:18:16.929 }' 00:18:16.929 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:16.929 13:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:17.497 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:17.497 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:17.497 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:17.497 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:17.497 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:17.497 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:17.497 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:17.497 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:17.757 [2024-07-12 13:44:06.121746] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:17.757 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:17.757 "name": "Existed_Raid", 00:18:17.757 "aliases": [ 00:18:17.757 "7305f669-3754-4546-92af-205b0c59880e" 00:18:17.757 ], 00:18:17.757 "product_name": "Raid Volume", 00:18:17.757 "block_size": 512, 00:18:17.757 "num_blocks": 253952, 00:18:17.757 "uuid": "7305f669-3754-4546-92af-205b0c59880e", 00:18:17.757 "assigned_rate_limits": { 00:18:17.757 "rw_ios_per_sec": 0, 00:18:17.757 "rw_mbytes_per_sec": 0, 00:18:17.757 "r_mbytes_per_sec": 0, 00:18:17.757 "w_mbytes_per_sec": 0 00:18:17.757 }, 00:18:17.757 "claimed": false, 00:18:17.757 "zoned": false, 00:18:17.757 "supported_io_types": { 00:18:17.757 "read": true, 00:18:17.757 "write": true, 00:18:17.757 "unmap": true, 00:18:17.757 "flush": true, 00:18:17.757 "reset": true, 00:18:17.757 "nvme_admin": false, 00:18:17.757 "nvme_io": false, 00:18:17.757 "nvme_io_md": false, 00:18:17.757 "write_zeroes": true, 00:18:17.757 "zcopy": false, 00:18:17.757 "get_zone_info": false, 00:18:17.757 "zone_management": false, 00:18:17.757 "zone_append": false, 00:18:17.757 "compare": false, 00:18:17.757 "compare_and_write": false, 00:18:17.757 "abort": false, 00:18:17.757 "seek_hole": false, 00:18:17.757 "seek_data": false, 00:18:17.757 "copy": false, 00:18:17.757 "nvme_iov_md": false 00:18:17.757 }, 00:18:17.757 "memory_domains": [ 00:18:17.757 { 00:18:17.757 "dma_device_id": "system", 00:18:17.757 "dma_device_type": 1 00:18:17.757 }, 00:18:17.757 { 00:18:17.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.757 "dma_device_type": 2 00:18:17.757 }, 00:18:17.757 { 00:18:17.757 "dma_device_id": "system", 00:18:17.757 "dma_device_type": 1 00:18:17.757 }, 00:18:17.757 { 00:18:17.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.757 "dma_device_type": 2 00:18:17.757 }, 00:18:17.757 { 00:18:17.757 "dma_device_id": "system", 00:18:17.757 "dma_device_type": 1 00:18:17.757 }, 00:18:17.757 { 00:18:17.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.757 "dma_device_type": 2 00:18:17.757 }, 00:18:17.757 { 
00:18:17.757 "dma_device_id": "system", 00:18:17.757 "dma_device_type": 1 00:18:17.757 }, 00:18:17.757 { 00:18:17.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.757 "dma_device_type": 2 00:18:17.757 } 00:18:17.757 ], 00:18:17.757 "driver_specific": { 00:18:17.757 "raid": { 00:18:17.757 "uuid": "7305f669-3754-4546-92af-205b0c59880e", 00:18:17.757 "strip_size_kb": 64, 00:18:17.757 "state": "online", 00:18:17.757 "raid_level": "raid0", 00:18:17.757 "superblock": true, 00:18:17.757 "num_base_bdevs": 4, 00:18:17.757 "num_base_bdevs_discovered": 4, 00:18:17.757 "num_base_bdevs_operational": 4, 00:18:17.757 "base_bdevs_list": [ 00:18:17.757 { 00:18:17.757 "name": "BaseBdev1", 00:18:17.757 "uuid": "423f1ae8-2be0-434e-8bea-b4f38bf5e84f", 00:18:17.757 "is_configured": true, 00:18:17.757 "data_offset": 2048, 00:18:17.757 "data_size": 63488 00:18:17.757 }, 00:18:17.757 { 00:18:17.757 "name": "BaseBdev2", 00:18:17.757 "uuid": "47c97ac2-94b0-40ee-a7e4-13c7450fabd2", 00:18:17.757 "is_configured": true, 00:18:17.757 "data_offset": 2048, 00:18:17.757 "data_size": 63488 00:18:17.757 }, 00:18:17.757 { 00:18:17.757 "name": "BaseBdev3", 00:18:17.757 "uuid": "ca137606-c2ae-4288-8bb3-c22a8445382c", 00:18:17.757 "is_configured": true, 00:18:17.757 "data_offset": 2048, 00:18:17.757 "data_size": 63488 00:18:17.757 }, 00:18:17.757 { 00:18:17.757 "name": "BaseBdev4", 00:18:17.757 "uuid": "eb250b73-275a-4c71-ba42-30917646bb76", 00:18:17.757 "is_configured": true, 00:18:17.757 "data_offset": 2048, 00:18:17.757 "data_size": 63488 00:18:17.757 } 00:18:17.757 ] 00:18:17.757 } 00:18:17.757 } 00:18:17.757 }' 00:18:17.757 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:17.757 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:17.757 BaseBdev2 00:18:17.757 BaseBdev3 00:18:17.757 BaseBdev4' 00:18:17.757 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:17.757 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:17.757 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:18.016 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:18.016 "name": "BaseBdev1", 00:18:18.016 "aliases": [ 00:18:18.016 "423f1ae8-2be0-434e-8bea-b4f38bf5e84f" 00:18:18.016 ], 00:18:18.016 "product_name": "Malloc disk", 00:18:18.016 "block_size": 512, 00:18:18.016 "num_blocks": 65536, 00:18:18.016 "uuid": "423f1ae8-2be0-434e-8bea-b4f38bf5e84f", 00:18:18.016 "assigned_rate_limits": { 00:18:18.016 "rw_ios_per_sec": 0, 00:18:18.016 "rw_mbytes_per_sec": 0, 00:18:18.016 "r_mbytes_per_sec": 0, 00:18:18.016 "w_mbytes_per_sec": 0 00:18:18.016 }, 00:18:18.016 "claimed": true, 00:18:18.016 "claim_type": "exclusive_write", 00:18:18.016 "zoned": false, 00:18:18.016 "supported_io_types": { 00:18:18.016 "read": true, 00:18:18.016 "write": true, 00:18:18.016 "unmap": true, 00:18:18.016 "flush": true, 00:18:18.016 "reset": true, 00:18:18.016 "nvme_admin": false, 00:18:18.016 "nvme_io": false, 00:18:18.016 "nvme_io_md": false, 00:18:18.016 "write_zeroes": true, 00:18:18.016 "zcopy": true, 00:18:18.016 "get_zone_info": false, 00:18:18.016 "zone_management": false, 00:18:18.016 "zone_append": 
false, 00:18:18.016 "compare": false, 00:18:18.016 "compare_and_write": false, 00:18:18.016 "abort": true, 00:18:18.016 "seek_hole": false, 00:18:18.016 "seek_data": false, 00:18:18.016 "copy": true, 00:18:18.016 "nvme_iov_md": false 00:18:18.016 }, 00:18:18.016 "memory_domains": [ 00:18:18.016 { 00:18:18.016 "dma_device_id": "system", 00:18:18.016 "dma_device_type": 1 00:18:18.016 }, 00:18:18.016 { 00:18:18.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.016 "dma_device_type": 2 00:18:18.016 } 00:18:18.016 ], 00:18:18.016 "driver_specific": {} 00:18:18.016 }' 00:18:18.016 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.016 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.016 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:18.016 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.016 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.275 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:18.275 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.275 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.275 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.275 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.275 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.275 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.275 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:18.275 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:18.275 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:18.534 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:18.534 "name": "BaseBdev2", 00:18:18.534 "aliases": [ 00:18:18.534 "47c97ac2-94b0-40ee-a7e4-13c7450fabd2" 00:18:18.534 ], 00:18:18.534 "product_name": "Malloc disk", 00:18:18.534 "block_size": 512, 00:18:18.534 "num_blocks": 65536, 00:18:18.535 "uuid": "47c97ac2-94b0-40ee-a7e4-13c7450fabd2", 00:18:18.535 "assigned_rate_limits": { 00:18:18.535 "rw_ios_per_sec": 0, 00:18:18.535 "rw_mbytes_per_sec": 0, 00:18:18.535 "r_mbytes_per_sec": 0, 00:18:18.535 "w_mbytes_per_sec": 0 00:18:18.535 }, 00:18:18.535 "claimed": true, 00:18:18.535 "claim_type": "exclusive_write", 00:18:18.535 "zoned": false, 00:18:18.535 "supported_io_types": { 00:18:18.535 "read": true, 00:18:18.535 "write": true, 00:18:18.535 "unmap": true, 00:18:18.535 "flush": true, 00:18:18.535 "reset": true, 00:18:18.535 "nvme_admin": false, 00:18:18.535 "nvme_io": false, 00:18:18.535 "nvme_io_md": false, 00:18:18.535 "write_zeroes": true, 00:18:18.535 "zcopy": true, 00:18:18.535 "get_zone_info": false, 00:18:18.535 "zone_management": false, 00:18:18.535 "zone_append": false, 00:18:18.535 "compare": false, 00:18:18.535 "compare_and_write": false, 00:18:18.535 "abort": true, 00:18:18.535 "seek_hole": 
false, 00:18:18.535 "seek_data": false, 00:18:18.535 "copy": true, 00:18:18.535 "nvme_iov_md": false 00:18:18.535 }, 00:18:18.535 "memory_domains": [ 00:18:18.535 { 00:18:18.535 "dma_device_id": "system", 00:18:18.535 "dma_device_type": 1 00:18:18.535 }, 00:18:18.535 { 00:18:18.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.535 "dma_device_type": 2 00:18:18.535 } 00:18:18.535 ], 00:18:18.535 "driver_specific": {} 00:18:18.535 }' 00:18:18.535 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.535 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.794 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:18.794 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.794 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.794 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:18.794 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.794 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.794 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.794 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.794 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.053 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:19.053 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:19.053 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:19.053 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:19.312 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:19.312 "name": "BaseBdev3", 00:18:19.312 "aliases": [ 00:18:19.312 "ca137606-c2ae-4288-8bb3-c22a8445382c" 00:18:19.312 ], 00:18:19.312 "product_name": "Malloc disk", 00:18:19.312 "block_size": 512, 00:18:19.312 "num_blocks": 65536, 00:18:19.312 "uuid": "ca137606-c2ae-4288-8bb3-c22a8445382c", 00:18:19.312 "assigned_rate_limits": { 00:18:19.312 "rw_ios_per_sec": 0, 00:18:19.312 "rw_mbytes_per_sec": 0, 00:18:19.312 "r_mbytes_per_sec": 0, 00:18:19.312 "w_mbytes_per_sec": 0 00:18:19.312 }, 00:18:19.312 "claimed": true, 00:18:19.312 "claim_type": "exclusive_write", 00:18:19.312 "zoned": false, 00:18:19.312 "supported_io_types": { 00:18:19.312 "read": true, 00:18:19.312 "write": true, 00:18:19.312 "unmap": true, 00:18:19.313 "flush": true, 00:18:19.313 "reset": true, 00:18:19.313 "nvme_admin": false, 00:18:19.313 "nvme_io": false, 00:18:19.313 "nvme_io_md": false, 00:18:19.313 "write_zeroes": true, 00:18:19.313 "zcopy": true, 00:18:19.313 "get_zone_info": false, 00:18:19.313 "zone_management": false, 00:18:19.313 "zone_append": false, 00:18:19.313 "compare": false, 00:18:19.313 "compare_and_write": false, 00:18:19.313 "abort": true, 00:18:19.313 "seek_hole": false, 00:18:19.313 "seek_data": false, 00:18:19.313 "copy": true, 00:18:19.313 "nvme_iov_md": false 00:18:19.313 }, 00:18:19.313 
"memory_domains": [ 00:18:19.313 { 00:18:19.313 "dma_device_id": "system", 00:18:19.313 "dma_device_type": 1 00:18:19.313 }, 00:18:19.313 { 00:18:19.313 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.313 "dma_device_type": 2 00:18:19.313 } 00:18:19.313 ], 00:18:19.313 "driver_specific": {} 00:18:19.313 }' 00:18:19.313 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.313 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.313 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:19.313 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.313 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.313 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:19.313 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.313 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.571 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:19.571 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.571 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.571 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:19.571 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:19.571 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:19.571 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:19.830 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:19.830 "name": "BaseBdev4", 00:18:19.830 "aliases": [ 00:18:19.830 "eb250b73-275a-4c71-ba42-30917646bb76" 00:18:19.830 ], 00:18:19.830 "product_name": "Malloc disk", 00:18:19.830 "block_size": 512, 00:18:19.830 "num_blocks": 65536, 00:18:19.830 "uuid": "eb250b73-275a-4c71-ba42-30917646bb76", 00:18:19.830 "assigned_rate_limits": { 00:18:19.830 "rw_ios_per_sec": 0, 00:18:19.830 "rw_mbytes_per_sec": 0, 00:18:19.830 "r_mbytes_per_sec": 0, 00:18:19.830 "w_mbytes_per_sec": 0 00:18:19.830 }, 00:18:19.830 "claimed": true, 00:18:19.830 "claim_type": "exclusive_write", 00:18:19.830 "zoned": false, 00:18:19.830 "supported_io_types": { 00:18:19.830 "read": true, 00:18:19.830 "write": true, 00:18:19.830 "unmap": true, 00:18:19.830 "flush": true, 00:18:19.830 "reset": true, 00:18:19.830 "nvme_admin": false, 00:18:19.830 "nvme_io": false, 00:18:19.830 "nvme_io_md": false, 00:18:19.830 "write_zeroes": true, 00:18:19.830 "zcopy": true, 00:18:19.830 "get_zone_info": false, 00:18:19.830 "zone_management": false, 00:18:19.830 "zone_append": false, 00:18:19.830 "compare": false, 00:18:19.830 "compare_and_write": false, 00:18:19.830 "abort": true, 00:18:19.830 "seek_hole": false, 00:18:19.830 "seek_data": false, 00:18:19.830 "copy": true, 00:18:19.830 "nvme_iov_md": false 00:18:19.830 }, 00:18:19.830 "memory_domains": [ 00:18:19.830 { 00:18:19.830 "dma_device_id": "system", 00:18:19.830 "dma_device_type": 1 00:18:19.830 }, 
00:18:19.830 { 00:18:19.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.830 "dma_device_type": 2 00:18:19.830 } 00:18:19.830 ], 00:18:19.830 "driver_specific": {} 00:18:19.830 }' 00:18:19.830 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.830 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.830 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:19.830 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.830 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.089 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:20.089 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.089 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.089 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:20.089 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.089 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.089 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:20.089 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:20.350 [2024-07-12 13:44:08.820626] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:20.350 [2024-07-12 13:44:08.820660] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:20.350 [2024-07-12 13:44:08.820711] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:20.350 13:44:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:20.350 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.609 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.609 "name": "Existed_Raid", 00:18:20.609 "uuid": "7305f669-3754-4546-92af-205b0c59880e", 00:18:20.609 "strip_size_kb": 64, 00:18:20.609 "state": "offline", 00:18:20.609 "raid_level": "raid0", 00:18:20.609 "superblock": true, 00:18:20.609 "num_base_bdevs": 4, 00:18:20.609 "num_base_bdevs_discovered": 3, 00:18:20.609 "num_base_bdevs_operational": 3, 00:18:20.609 "base_bdevs_list": [ 00:18:20.609 { 00:18:20.609 "name": null, 00:18:20.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.609 "is_configured": false, 00:18:20.609 "data_offset": 2048, 00:18:20.609 "data_size": 63488 00:18:20.609 }, 00:18:20.609 { 00:18:20.609 "name": "BaseBdev2", 00:18:20.609 "uuid": "47c97ac2-94b0-40ee-a7e4-13c7450fabd2", 00:18:20.609 "is_configured": true, 00:18:20.609 "data_offset": 2048, 00:18:20.609 "data_size": 63488 00:18:20.609 }, 00:18:20.609 { 00:18:20.609 "name": "BaseBdev3", 00:18:20.609 "uuid": "ca137606-c2ae-4288-8bb3-c22a8445382c", 00:18:20.609 "is_configured": true, 00:18:20.609 "data_offset": 2048, 00:18:20.609 "data_size": 63488 00:18:20.609 }, 00:18:20.609 { 00:18:20.609 "name": "BaseBdev4", 00:18:20.609 "uuid": "eb250b73-275a-4c71-ba42-30917646bb76", 00:18:20.609 "is_configured": true, 00:18:20.609 "data_offset": 2048, 00:18:20.609 "data_size": 63488 00:18:20.609 } 00:18:20.609 ] 00:18:20.609 }' 00:18:20.609 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.609 13:44:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:21.177 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:21.177 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:21.177 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.177 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:21.437 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:21.437 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:21.437 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:21.697 [2024-07-12 13:44:10.202361] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:21.697 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:21.697 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:21.697 13:44:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.697 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:21.956 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:21.956 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:21.956 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:22.215 [2024-07-12 13:44:10.704077] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:22.215 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:22.215 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:22.215 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.215 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:22.474 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:22.474 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:22.474 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:22.733 [2024-07-12 13:44:11.257985] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:22.733 [2024-07-12 13:44:11.258029] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x198ac40 name Existed_Raid, state offline 00:18:22.733 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:22.733 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:22.733 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.733 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:22.992 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:22.992 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:22.992 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:22.992 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:22.992 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:22.992 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:23.252 BaseBdev2 00:18:23.252 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:23.252 13:44:11 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:23.252 13:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:23.252 13:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:23.252 13:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:23.252 13:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:23.252 13:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:23.511 13:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:23.770 [ 00:18:23.770 { 00:18:23.770 "name": "BaseBdev2", 00:18:23.770 "aliases": [ 00:18:23.770 "3be16ef8-16d8-47b8-98f4-661e312ad37c" 00:18:23.770 ], 00:18:23.770 "product_name": "Malloc disk", 00:18:23.770 "block_size": 512, 00:18:23.770 "num_blocks": 65536, 00:18:23.770 "uuid": "3be16ef8-16d8-47b8-98f4-661e312ad37c", 00:18:23.770 "assigned_rate_limits": { 00:18:23.770 "rw_ios_per_sec": 0, 00:18:23.770 "rw_mbytes_per_sec": 0, 00:18:23.770 "r_mbytes_per_sec": 0, 00:18:23.770 "w_mbytes_per_sec": 0 00:18:23.770 }, 00:18:23.770 "claimed": false, 00:18:23.770 "zoned": false, 00:18:23.770 "supported_io_types": { 00:18:23.770 "read": true, 00:18:23.770 "write": true, 00:18:23.770 "unmap": true, 00:18:23.770 "flush": true, 00:18:23.770 "reset": true, 00:18:23.770 "nvme_admin": false, 00:18:23.770 "nvme_io": false, 00:18:23.770 "nvme_io_md": false, 00:18:23.770 "write_zeroes": true, 00:18:23.770 "zcopy": true, 00:18:23.770 "get_zone_info": false, 00:18:23.770 "zone_management": false, 00:18:23.770 "zone_append": false, 00:18:23.770 "compare": false, 00:18:23.770 "compare_and_write": false, 00:18:23.770 "abort": true, 00:18:23.770 "seek_hole": false, 00:18:23.770 "seek_data": false, 00:18:23.770 "copy": true, 00:18:23.770 "nvme_iov_md": false 00:18:23.770 }, 00:18:23.770 "memory_domains": [ 00:18:23.770 { 00:18:23.770 "dma_device_id": "system", 00:18:23.770 "dma_device_type": 1 00:18:23.770 }, 00:18:23.770 { 00:18:23.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.770 "dma_device_type": 2 00:18:23.770 } 00:18:23.770 ], 00:18:23.770 "driver_specific": {} 00:18:23.770 } 00:18:23.770 ] 00:18:23.770 13:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:23.770 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:23.770 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:23.770 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:24.028 BaseBdev3 00:18:24.028 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:24.028 13:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:24.028 13:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:24.028 13:44:12 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:18:24.028 13:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:24.028 13:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:24.028 13:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:24.286 13:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:24.545 [ 00:18:24.545 { 00:18:24.545 "name": "BaseBdev3", 00:18:24.545 "aliases": [ 00:18:24.545 "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3" 00:18:24.545 ], 00:18:24.545 "product_name": "Malloc disk", 00:18:24.545 "block_size": 512, 00:18:24.545 "num_blocks": 65536, 00:18:24.545 "uuid": "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3", 00:18:24.545 "assigned_rate_limits": { 00:18:24.545 "rw_ios_per_sec": 0, 00:18:24.545 "rw_mbytes_per_sec": 0, 00:18:24.545 "r_mbytes_per_sec": 0, 00:18:24.545 "w_mbytes_per_sec": 0 00:18:24.545 }, 00:18:24.545 "claimed": false, 00:18:24.545 "zoned": false, 00:18:24.545 "supported_io_types": { 00:18:24.545 "read": true, 00:18:24.545 "write": true, 00:18:24.545 "unmap": true, 00:18:24.545 "flush": true, 00:18:24.545 "reset": true, 00:18:24.545 "nvme_admin": false, 00:18:24.545 "nvme_io": false, 00:18:24.545 "nvme_io_md": false, 00:18:24.545 "write_zeroes": true, 00:18:24.545 "zcopy": true, 00:18:24.545 "get_zone_info": false, 00:18:24.545 "zone_management": false, 00:18:24.545 "zone_append": false, 00:18:24.545 "compare": false, 00:18:24.545 "compare_and_write": false, 00:18:24.545 "abort": true, 00:18:24.545 "seek_hole": false, 00:18:24.545 "seek_data": false, 00:18:24.545 "copy": true, 00:18:24.545 "nvme_iov_md": false 00:18:24.545 }, 00:18:24.545 "memory_domains": [ 00:18:24.545 { 00:18:24.545 "dma_device_id": "system", 00:18:24.545 "dma_device_type": 1 00:18:24.545 }, 00:18:24.545 { 00:18:24.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.545 "dma_device_type": 2 00:18:24.545 } 00:18:24.545 ], 00:18:24.545 "driver_specific": {} 00:18:24.545 } 00:18:24.545 ] 00:18:24.545 13:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:24.545 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:24.545 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:24.545 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:24.803 BaseBdev4 00:18:24.803 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:24.803 13:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:24.803 13:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:24.803 13:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:24.803 13:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:24.803 13:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:18:24.803 13:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:25.061 13:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:25.320 [ 00:18:25.320 { 00:18:25.320 "name": "BaseBdev4", 00:18:25.320 "aliases": [ 00:18:25.320 "66f7d511-aea7-457e-84a4-ce1525b047b1" 00:18:25.320 ], 00:18:25.320 "product_name": "Malloc disk", 00:18:25.320 "block_size": 512, 00:18:25.320 "num_blocks": 65536, 00:18:25.320 "uuid": "66f7d511-aea7-457e-84a4-ce1525b047b1", 00:18:25.320 "assigned_rate_limits": { 00:18:25.320 "rw_ios_per_sec": 0, 00:18:25.320 "rw_mbytes_per_sec": 0, 00:18:25.320 "r_mbytes_per_sec": 0, 00:18:25.320 "w_mbytes_per_sec": 0 00:18:25.320 }, 00:18:25.320 "claimed": false, 00:18:25.320 "zoned": false, 00:18:25.320 "supported_io_types": { 00:18:25.320 "read": true, 00:18:25.320 "write": true, 00:18:25.320 "unmap": true, 00:18:25.320 "flush": true, 00:18:25.320 "reset": true, 00:18:25.320 "nvme_admin": false, 00:18:25.320 "nvme_io": false, 00:18:25.320 "nvme_io_md": false, 00:18:25.320 "write_zeroes": true, 00:18:25.320 "zcopy": true, 00:18:25.320 "get_zone_info": false, 00:18:25.320 "zone_management": false, 00:18:25.320 "zone_append": false, 00:18:25.320 "compare": false, 00:18:25.320 "compare_and_write": false, 00:18:25.320 "abort": true, 00:18:25.320 "seek_hole": false, 00:18:25.320 "seek_data": false, 00:18:25.320 "copy": true, 00:18:25.320 "nvme_iov_md": false 00:18:25.320 }, 00:18:25.320 "memory_domains": [ 00:18:25.320 { 00:18:25.320 "dma_device_id": "system", 00:18:25.320 "dma_device_type": 1 00:18:25.320 }, 00:18:25.320 { 00:18:25.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:25.320 "dma_device_type": 2 00:18:25.320 } 00:18:25.320 ], 00:18:25.320 "driver_specific": {} 00:18:25.320 } 00:18:25.320 ] 00:18:25.320 13:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:25.320 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:25.320 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:25.320 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:25.579 [2024-07-12 13:44:14.014916] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:25.579 [2024-07-12 13:44:14.014967] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:25.579 [2024-07-12 13:44:14.014995] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:25.579 [2024-07-12 13:44:14.016379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:25.579 [2024-07-12 13:44:14.016423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.579 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:25.838 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.838 "name": "Existed_Raid", 00:18:25.838 "uuid": "5c75330a-7d27-4b37-994d-3c637645f787", 00:18:25.838 "strip_size_kb": 64, 00:18:25.838 "state": "configuring", 00:18:25.838 "raid_level": "raid0", 00:18:25.838 "superblock": true, 00:18:25.838 "num_base_bdevs": 4, 00:18:25.838 "num_base_bdevs_discovered": 3, 00:18:25.838 "num_base_bdevs_operational": 4, 00:18:25.838 "base_bdevs_list": [ 00:18:25.838 { 00:18:25.838 "name": "BaseBdev1", 00:18:25.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.838 "is_configured": false, 00:18:25.838 "data_offset": 0, 00:18:25.838 "data_size": 0 00:18:25.838 }, 00:18:25.838 { 00:18:25.838 "name": "BaseBdev2", 00:18:25.838 "uuid": "3be16ef8-16d8-47b8-98f4-661e312ad37c", 00:18:25.838 "is_configured": true, 00:18:25.838 "data_offset": 2048, 00:18:25.838 "data_size": 63488 00:18:25.838 }, 00:18:25.838 { 00:18:25.838 "name": "BaseBdev3", 00:18:25.838 "uuid": "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3", 00:18:25.838 "is_configured": true, 00:18:25.838 "data_offset": 2048, 00:18:25.838 "data_size": 63488 00:18:25.838 }, 00:18:25.838 { 00:18:25.838 "name": "BaseBdev4", 00:18:25.838 "uuid": "66f7d511-aea7-457e-84a4-ce1525b047b1", 00:18:25.838 "is_configured": true, 00:18:25.838 "data_offset": 2048, 00:18:25.838 "data_size": 63488 00:18:25.838 } 00:18:25.838 ] 00:18:25.838 }' 00:18:25.838 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.838 13:44:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:26.775 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:27.034 [2024-07-12 13:44:15.470773] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:27.034 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:27.034 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:27.034 13:44:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:27.034 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:27.034 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:27.034 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:27.034 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.034 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.034 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.034 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.034 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.034 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:27.293 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.293 "name": "Existed_Raid", 00:18:27.293 "uuid": "5c75330a-7d27-4b37-994d-3c637645f787", 00:18:27.293 "strip_size_kb": 64, 00:18:27.293 "state": "configuring", 00:18:27.293 "raid_level": "raid0", 00:18:27.293 "superblock": true, 00:18:27.293 "num_base_bdevs": 4, 00:18:27.293 "num_base_bdevs_discovered": 2, 00:18:27.293 "num_base_bdevs_operational": 4, 00:18:27.293 "base_bdevs_list": [ 00:18:27.293 { 00:18:27.293 "name": "BaseBdev1", 00:18:27.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.293 "is_configured": false, 00:18:27.293 "data_offset": 0, 00:18:27.293 "data_size": 0 00:18:27.293 }, 00:18:27.293 { 00:18:27.293 "name": null, 00:18:27.293 "uuid": "3be16ef8-16d8-47b8-98f4-661e312ad37c", 00:18:27.293 "is_configured": false, 00:18:27.293 "data_offset": 2048, 00:18:27.293 "data_size": 63488 00:18:27.293 }, 00:18:27.293 { 00:18:27.293 "name": "BaseBdev3", 00:18:27.293 "uuid": "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3", 00:18:27.293 "is_configured": true, 00:18:27.293 "data_offset": 2048, 00:18:27.293 "data_size": 63488 00:18:27.293 }, 00:18:27.293 { 00:18:27.293 "name": "BaseBdev4", 00:18:27.293 "uuid": "66f7d511-aea7-457e-84a4-ce1525b047b1", 00:18:27.293 "is_configured": true, 00:18:27.293 "data_offset": 2048, 00:18:27.293 "data_size": 63488 00:18:27.293 } 00:18:27.293 ] 00:18:27.293 }' 00:18:27.293 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.293 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.860 13:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:27.860 13:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.118 13:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:28.118 13:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
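The trace above re-assembles the array over the test's dedicated RPC socket: the malloc base bdevs are re-created, bdev_raid_create is invoked with a superblock (-s) while BaseBdev1 does not exist yet, and individual members are then removed and added so that each step can be checked against the expected "configuring" state. A minimal stand-alone sketch of that RPC sequence follows; the $rpc shorthand variable is an illustrative assumption, while the socket path, subcommands and flags are the ones visible in the trace.

    # Illustrative sketch only: assemble a superblock raid0 from malloc base bdevs
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for b in BaseBdev2 BaseBdev3 BaseBdev4; do
        $rpc bdev_malloc_create 32 512 -b "$b"   # 32 MiB backing store, 512-byte blocks
    done
    # -z 64: 64 KiB strip size, -s: store a superblock, -r raid0: RAID level
    $rpc bdev_raid_create -z 64 -s -r raid0 \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
    # With BaseBdev1 still absent the raid bdev is expected to stay "configuring".
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'

Run by hand against a live raid target, the same sequence would be expected to leave the volume in the "configuring" state reported by the output that follows.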
00:18:28.376 [2024-07-12 13:44:16.883164] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:28.376 BaseBdev1 00:18:28.376 13:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:28.376 13:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:28.376 13:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:28.376 13:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:28.376 13:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:28.376 13:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:28.376 13:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:28.634 13:44:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:28.893 [ 00:18:28.893 { 00:18:28.893 "name": "BaseBdev1", 00:18:28.893 "aliases": [ 00:18:28.893 "ddebcc67-0411-46a2-9488-aebcd985f0a4" 00:18:28.893 ], 00:18:28.893 "product_name": "Malloc disk", 00:18:28.893 "block_size": 512, 00:18:28.893 "num_blocks": 65536, 00:18:28.893 "uuid": "ddebcc67-0411-46a2-9488-aebcd985f0a4", 00:18:28.893 "assigned_rate_limits": { 00:18:28.893 "rw_ios_per_sec": 0, 00:18:28.893 "rw_mbytes_per_sec": 0, 00:18:28.893 "r_mbytes_per_sec": 0, 00:18:28.893 "w_mbytes_per_sec": 0 00:18:28.893 }, 00:18:28.893 "claimed": true, 00:18:28.893 "claim_type": "exclusive_write", 00:18:28.893 "zoned": false, 00:18:28.893 "supported_io_types": { 00:18:28.893 "read": true, 00:18:28.893 "write": true, 00:18:28.893 "unmap": true, 00:18:28.893 "flush": true, 00:18:28.893 "reset": true, 00:18:28.893 "nvme_admin": false, 00:18:28.893 "nvme_io": false, 00:18:28.893 "nvme_io_md": false, 00:18:28.893 "write_zeroes": true, 00:18:28.893 "zcopy": true, 00:18:28.893 "get_zone_info": false, 00:18:28.893 "zone_management": false, 00:18:28.893 "zone_append": false, 00:18:28.893 "compare": false, 00:18:28.893 "compare_and_write": false, 00:18:28.893 "abort": true, 00:18:28.893 "seek_hole": false, 00:18:28.893 "seek_data": false, 00:18:28.893 "copy": true, 00:18:28.893 "nvme_iov_md": false 00:18:28.893 }, 00:18:28.893 "memory_domains": [ 00:18:28.893 { 00:18:28.893 "dma_device_id": "system", 00:18:28.893 "dma_device_type": 1 00:18:28.893 }, 00:18:28.893 { 00:18:28.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.893 "dma_device_type": 2 00:18:28.893 } 00:18:28.893 ], 00:18:28.893 "driver_specific": {} 00:18:28.893 } 00:18:28.893 ] 00:18:28.893 13:44:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:28.893 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:28.893 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:28.893 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:28.893 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:18:28.894 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:28.894 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:28.894 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:28.894 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:28.894 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:28.894 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:28.894 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.894 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:29.158 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.158 "name": "Existed_Raid", 00:18:29.158 "uuid": "5c75330a-7d27-4b37-994d-3c637645f787", 00:18:29.158 "strip_size_kb": 64, 00:18:29.158 "state": "configuring", 00:18:29.158 "raid_level": "raid0", 00:18:29.158 "superblock": true, 00:18:29.158 "num_base_bdevs": 4, 00:18:29.158 "num_base_bdevs_discovered": 3, 00:18:29.158 "num_base_bdevs_operational": 4, 00:18:29.158 "base_bdevs_list": [ 00:18:29.158 { 00:18:29.158 "name": "BaseBdev1", 00:18:29.158 "uuid": "ddebcc67-0411-46a2-9488-aebcd985f0a4", 00:18:29.158 "is_configured": true, 00:18:29.158 "data_offset": 2048, 00:18:29.158 "data_size": 63488 00:18:29.158 }, 00:18:29.158 { 00:18:29.158 "name": null, 00:18:29.158 "uuid": "3be16ef8-16d8-47b8-98f4-661e312ad37c", 00:18:29.158 "is_configured": false, 00:18:29.158 "data_offset": 2048, 00:18:29.158 "data_size": 63488 00:18:29.158 }, 00:18:29.158 { 00:18:29.158 "name": "BaseBdev3", 00:18:29.158 "uuid": "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3", 00:18:29.158 "is_configured": true, 00:18:29.158 "data_offset": 2048, 00:18:29.158 "data_size": 63488 00:18:29.158 }, 00:18:29.158 { 00:18:29.158 "name": "BaseBdev4", 00:18:29.158 "uuid": "66f7d511-aea7-457e-84a4-ce1525b047b1", 00:18:29.158 "is_configured": true, 00:18:29.158 "data_offset": 2048, 00:18:29.158 "data_size": 63488 00:18:29.158 } 00:18:29.158 ] 00:18:29.158 }' 00:18:29.158 13:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.158 13:44:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:29.800 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.800 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:30.079 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:30.079 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:30.369 [2024-07-12 13:44:18.732117] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.369 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.628 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.628 "name": "Existed_Raid", 00:18:30.628 "uuid": "5c75330a-7d27-4b37-994d-3c637645f787", 00:18:30.628 "strip_size_kb": 64, 00:18:30.628 "state": "configuring", 00:18:30.628 "raid_level": "raid0", 00:18:30.628 "superblock": true, 00:18:30.628 "num_base_bdevs": 4, 00:18:30.629 "num_base_bdevs_discovered": 2, 00:18:30.629 "num_base_bdevs_operational": 4, 00:18:30.629 "base_bdevs_list": [ 00:18:30.629 { 00:18:30.629 "name": "BaseBdev1", 00:18:30.629 "uuid": "ddebcc67-0411-46a2-9488-aebcd985f0a4", 00:18:30.629 "is_configured": true, 00:18:30.629 "data_offset": 2048, 00:18:30.629 "data_size": 63488 00:18:30.629 }, 00:18:30.629 { 00:18:30.629 "name": null, 00:18:30.629 "uuid": "3be16ef8-16d8-47b8-98f4-661e312ad37c", 00:18:30.629 "is_configured": false, 00:18:30.629 "data_offset": 2048, 00:18:30.629 "data_size": 63488 00:18:30.629 }, 00:18:30.629 { 00:18:30.629 "name": null, 00:18:30.629 "uuid": "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3", 00:18:30.629 "is_configured": false, 00:18:30.629 "data_offset": 2048, 00:18:30.629 "data_size": 63488 00:18:30.629 }, 00:18:30.629 { 00:18:30.629 "name": "BaseBdev4", 00:18:30.629 "uuid": "66f7d511-aea7-457e-84a4-ce1525b047b1", 00:18:30.629 "is_configured": true, 00:18:30.629 "data_offset": 2048, 00:18:30.629 "data_size": 63488 00:18:30.629 } 00:18:30.629 ] 00:18:30.629 }' 00:18:30.629 13:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.629 13:44:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:31.195 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:31.195 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.195 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:31.195 
13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:31.454 [2024-07-12 13:44:19.927404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.454 13:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:31.713 13:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:31.713 "name": "Existed_Raid", 00:18:31.713 "uuid": "5c75330a-7d27-4b37-994d-3c637645f787", 00:18:31.713 "strip_size_kb": 64, 00:18:31.713 "state": "configuring", 00:18:31.713 "raid_level": "raid0", 00:18:31.713 "superblock": true, 00:18:31.713 "num_base_bdevs": 4, 00:18:31.713 "num_base_bdevs_discovered": 3, 00:18:31.713 "num_base_bdevs_operational": 4, 00:18:31.713 "base_bdevs_list": [ 00:18:31.713 { 00:18:31.713 "name": "BaseBdev1", 00:18:31.713 "uuid": "ddebcc67-0411-46a2-9488-aebcd985f0a4", 00:18:31.713 "is_configured": true, 00:18:31.713 "data_offset": 2048, 00:18:31.713 "data_size": 63488 00:18:31.713 }, 00:18:31.713 { 00:18:31.713 "name": null, 00:18:31.713 "uuid": "3be16ef8-16d8-47b8-98f4-661e312ad37c", 00:18:31.713 "is_configured": false, 00:18:31.713 "data_offset": 2048, 00:18:31.713 "data_size": 63488 00:18:31.713 }, 00:18:31.713 { 00:18:31.713 "name": "BaseBdev3", 00:18:31.713 "uuid": "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3", 00:18:31.713 "is_configured": true, 00:18:31.713 "data_offset": 2048, 00:18:31.713 "data_size": 63488 00:18:31.713 }, 00:18:31.713 { 00:18:31.713 "name": "BaseBdev4", 00:18:31.713 "uuid": "66f7d511-aea7-457e-84a4-ce1525b047b1", 00:18:31.713 "is_configured": true, 00:18:31.713 "data_offset": 2048, 00:18:31.713 "data_size": 63488 00:18:31.713 } 00:18:31.713 ] 00:18:31.713 }' 00:18:31.713 13:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:31.713 13:44:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:32.279 13:44:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.279 13:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:32.538 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:32.538 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:32.797 [2024-07-12 13:44:21.315112] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.797 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.055 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:33.055 "name": "Existed_Raid", 00:18:33.055 "uuid": "5c75330a-7d27-4b37-994d-3c637645f787", 00:18:33.055 "strip_size_kb": 64, 00:18:33.055 "state": "configuring", 00:18:33.055 "raid_level": "raid0", 00:18:33.055 "superblock": true, 00:18:33.055 "num_base_bdevs": 4, 00:18:33.055 "num_base_bdevs_discovered": 2, 00:18:33.055 "num_base_bdevs_operational": 4, 00:18:33.055 "base_bdevs_list": [ 00:18:33.055 { 00:18:33.055 "name": null, 00:18:33.055 "uuid": "ddebcc67-0411-46a2-9488-aebcd985f0a4", 00:18:33.055 "is_configured": false, 00:18:33.055 "data_offset": 2048, 00:18:33.055 "data_size": 63488 00:18:33.055 }, 00:18:33.055 { 00:18:33.055 "name": null, 00:18:33.055 "uuid": "3be16ef8-16d8-47b8-98f4-661e312ad37c", 00:18:33.055 "is_configured": false, 00:18:33.055 "data_offset": 2048, 00:18:33.055 "data_size": 63488 00:18:33.055 }, 00:18:33.055 { 00:18:33.055 "name": "BaseBdev3", 00:18:33.055 "uuid": "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3", 00:18:33.055 "is_configured": true, 00:18:33.055 "data_offset": 2048, 00:18:33.055 "data_size": 63488 00:18:33.055 }, 00:18:33.055 { 00:18:33.055 "name": "BaseBdev4", 00:18:33.055 "uuid": 
"66f7d511-aea7-457e-84a4-ce1525b047b1", 00:18:33.055 "is_configured": true, 00:18:33.055 "data_offset": 2048, 00:18:33.055 "data_size": 63488 00:18:33.055 } 00:18:33.055 ] 00:18:33.055 }' 00:18:33.055 13:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:33.055 13:44:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:33.621 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.621 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:33.880 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:33.880 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:34.138 [2024-07-12 13:44:22.674992] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:34.138 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.396 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:34.396 "name": "Existed_Raid", 00:18:34.396 "uuid": "5c75330a-7d27-4b37-994d-3c637645f787", 00:18:34.396 "strip_size_kb": 64, 00:18:34.396 "state": "configuring", 00:18:34.396 "raid_level": "raid0", 00:18:34.396 "superblock": true, 00:18:34.396 "num_base_bdevs": 4, 00:18:34.396 "num_base_bdevs_discovered": 3, 00:18:34.396 "num_base_bdevs_operational": 4, 00:18:34.396 "base_bdevs_list": [ 00:18:34.396 { 00:18:34.396 "name": null, 00:18:34.396 "uuid": "ddebcc67-0411-46a2-9488-aebcd985f0a4", 00:18:34.396 "is_configured": false, 00:18:34.396 "data_offset": 2048, 00:18:34.396 "data_size": 63488 00:18:34.396 }, 00:18:34.396 { 00:18:34.396 "name": "BaseBdev2", 00:18:34.396 "uuid": 
"3be16ef8-16d8-47b8-98f4-661e312ad37c", 00:18:34.396 "is_configured": true, 00:18:34.396 "data_offset": 2048, 00:18:34.396 "data_size": 63488 00:18:34.396 }, 00:18:34.396 { 00:18:34.396 "name": "BaseBdev3", 00:18:34.396 "uuid": "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3", 00:18:34.396 "is_configured": true, 00:18:34.396 "data_offset": 2048, 00:18:34.396 "data_size": 63488 00:18:34.396 }, 00:18:34.396 { 00:18:34.396 "name": "BaseBdev4", 00:18:34.396 "uuid": "66f7d511-aea7-457e-84a4-ce1525b047b1", 00:18:34.396 "is_configured": true, 00:18:34.396 "data_offset": 2048, 00:18:34.396 "data_size": 63488 00:18:34.396 } 00:18:34.396 ] 00:18:34.396 }' 00:18:34.396 13:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:34.396 13:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:35.328 13:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.328 13:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:35.328 13:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:35.328 13:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.328 13:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:35.585 13:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ddebcc67-0411-46a2-9488-aebcd985f0a4 00:18:35.844 [2024-07-12 13:44:24.206520] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:35.844 [2024-07-12 13:44:24.206679] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19892d0 00:18:35.844 [2024-07-12 13:44:24.206692] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:35.844 [2024-07-12 13:44:24.206871] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x198f2a0 00:18:35.844 [2024-07-12 13:44:24.207021] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19892d0 00:18:35.844 [2024-07-12 13:44:24.207036] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19892d0 00:18:35.844 [2024-07-12 13:44:24.207140] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:35.844 NewBaseBdev 00:18:35.844 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:35.844 13:44:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:35.844 13:44:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:35.844 13:44:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:35.844 13:44:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:35.844 13:44:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:35.844 13:44:24 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:36.103 13:44:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:36.362 [ 00:18:36.362 { 00:18:36.362 "name": "NewBaseBdev", 00:18:36.362 "aliases": [ 00:18:36.362 "ddebcc67-0411-46a2-9488-aebcd985f0a4" 00:18:36.362 ], 00:18:36.362 "product_name": "Malloc disk", 00:18:36.362 "block_size": 512, 00:18:36.362 "num_blocks": 65536, 00:18:36.362 "uuid": "ddebcc67-0411-46a2-9488-aebcd985f0a4", 00:18:36.362 "assigned_rate_limits": { 00:18:36.362 "rw_ios_per_sec": 0, 00:18:36.362 "rw_mbytes_per_sec": 0, 00:18:36.362 "r_mbytes_per_sec": 0, 00:18:36.362 "w_mbytes_per_sec": 0 00:18:36.362 }, 00:18:36.362 "claimed": true, 00:18:36.362 "claim_type": "exclusive_write", 00:18:36.362 "zoned": false, 00:18:36.362 "supported_io_types": { 00:18:36.362 "read": true, 00:18:36.362 "write": true, 00:18:36.362 "unmap": true, 00:18:36.362 "flush": true, 00:18:36.362 "reset": true, 00:18:36.362 "nvme_admin": false, 00:18:36.362 "nvme_io": false, 00:18:36.362 "nvme_io_md": false, 00:18:36.362 "write_zeroes": true, 00:18:36.362 "zcopy": true, 00:18:36.362 "get_zone_info": false, 00:18:36.362 "zone_management": false, 00:18:36.362 "zone_append": false, 00:18:36.362 "compare": false, 00:18:36.362 "compare_and_write": false, 00:18:36.362 "abort": true, 00:18:36.362 "seek_hole": false, 00:18:36.362 "seek_data": false, 00:18:36.362 "copy": true, 00:18:36.362 "nvme_iov_md": false 00:18:36.362 }, 00:18:36.362 "memory_domains": [ 00:18:36.362 { 00:18:36.362 "dma_device_id": "system", 00:18:36.362 "dma_device_type": 1 00:18:36.362 }, 00:18:36.362 { 00:18:36.362 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.362 "dma_device_type": 2 00:18:36.362 } 00:18:36.362 ], 00:18:36.362 "driver_specific": {} 00:18:36.362 } 00:18:36.362 ] 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:18:36.362 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:36.622 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:36.622 "name": "Existed_Raid", 00:18:36.622 "uuid": "5c75330a-7d27-4b37-994d-3c637645f787", 00:18:36.622 "strip_size_kb": 64, 00:18:36.622 "state": "online", 00:18:36.622 "raid_level": "raid0", 00:18:36.622 "superblock": true, 00:18:36.622 "num_base_bdevs": 4, 00:18:36.622 "num_base_bdevs_discovered": 4, 00:18:36.622 "num_base_bdevs_operational": 4, 00:18:36.622 "base_bdevs_list": [ 00:18:36.622 { 00:18:36.622 "name": "NewBaseBdev", 00:18:36.622 "uuid": "ddebcc67-0411-46a2-9488-aebcd985f0a4", 00:18:36.622 "is_configured": true, 00:18:36.622 "data_offset": 2048, 00:18:36.622 "data_size": 63488 00:18:36.622 }, 00:18:36.622 { 00:18:36.622 "name": "BaseBdev2", 00:18:36.622 "uuid": "3be16ef8-16d8-47b8-98f4-661e312ad37c", 00:18:36.622 "is_configured": true, 00:18:36.622 "data_offset": 2048, 00:18:36.622 "data_size": 63488 00:18:36.622 }, 00:18:36.622 { 00:18:36.622 "name": "BaseBdev3", 00:18:36.622 "uuid": "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3", 00:18:36.622 "is_configured": true, 00:18:36.622 "data_offset": 2048, 00:18:36.622 "data_size": 63488 00:18:36.622 }, 00:18:36.622 { 00:18:36.622 "name": "BaseBdev4", 00:18:36.622 "uuid": "66f7d511-aea7-457e-84a4-ce1525b047b1", 00:18:36.622 "is_configured": true, 00:18:36.622 "data_offset": 2048, 00:18:36.622 "data_size": 63488 00:18:36.622 } 00:18:36.622 ] 00:18:36.622 }' 00:18:36.622 13:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:36.622 13:44:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:37.191 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:37.191 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:37.191 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:37.191 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:37.191 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:37.191 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:37.191 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:37.191 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:37.191 [2024-07-12 13:44:25.771002] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:37.450 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:37.450 "name": "Existed_Raid", 00:18:37.450 "aliases": [ 00:18:37.450 "5c75330a-7d27-4b37-994d-3c637645f787" 00:18:37.450 ], 00:18:37.450 "product_name": "Raid Volume", 00:18:37.450 "block_size": 512, 00:18:37.450 "num_blocks": 253952, 00:18:37.450 "uuid": "5c75330a-7d27-4b37-994d-3c637645f787", 00:18:37.450 "assigned_rate_limits": { 00:18:37.450 "rw_ios_per_sec": 0, 00:18:37.450 "rw_mbytes_per_sec": 0, 00:18:37.450 "r_mbytes_per_sec": 0, 00:18:37.450 "w_mbytes_per_sec": 0 00:18:37.450 }, 00:18:37.450 
"claimed": false, 00:18:37.450 "zoned": false, 00:18:37.450 "supported_io_types": { 00:18:37.450 "read": true, 00:18:37.450 "write": true, 00:18:37.450 "unmap": true, 00:18:37.450 "flush": true, 00:18:37.450 "reset": true, 00:18:37.450 "nvme_admin": false, 00:18:37.450 "nvme_io": false, 00:18:37.450 "nvme_io_md": false, 00:18:37.450 "write_zeroes": true, 00:18:37.450 "zcopy": false, 00:18:37.450 "get_zone_info": false, 00:18:37.450 "zone_management": false, 00:18:37.450 "zone_append": false, 00:18:37.450 "compare": false, 00:18:37.450 "compare_and_write": false, 00:18:37.450 "abort": false, 00:18:37.450 "seek_hole": false, 00:18:37.450 "seek_data": false, 00:18:37.450 "copy": false, 00:18:37.450 "nvme_iov_md": false 00:18:37.450 }, 00:18:37.450 "memory_domains": [ 00:18:37.450 { 00:18:37.450 "dma_device_id": "system", 00:18:37.450 "dma_device_type": 1 00:18:37.450 }, 00:18:37.450 { 00:18:37.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.450 "dma_device_type": 2 00:18:37.450 }, 00:18:37.450 { 00:18:37.450 "dma_device_id": "system", 00:18:37.450 "dma_device_type": 1 00:18:37.450 }, 00:18:37.450 { 00:18:37.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.450 "dma_device_type": 2 00:18:37.450 }, 00:18:37.451 { 00:18:37.451 "dma_device_id": "system", 00:18:37.451 "dma_device_type": 1 00:18:37.451 }, 00:18:37.451 { 00:18:37.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.451 "dma_device_type": 2 00:18:37.451 }, 00:18:37.451 { 00:18:37.451 "dma_device_id": "system", 00:18:37.451 "dma_device_type": 1 00:18:37.451 }, 00:18:37.451 { 00:18:37.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.451 "dma_device_type": 2 00:18:37.451 } 00:18:37.451 ], 00:18:37.451 "driver_specific": { 00:18:37.451 "raid": { 00:18:37.451 "uuid": "5c75330a-7d27-4b37-994d-3c637645f787", 00:18:37.451 "strip_size_kb": 64, 00:18:37.451 "state": "online", 00:18:37.451 "raid_level": "raid0", 00:18:37.451 "superblock": true, 00:18:37.451 "num_base_bdevs": 4, 00:18:37.451 "num_base_bdevs_discovered": 4, 00:18:37.451 "num_base_bdevs_operational": 4, 00:18:37.451 "base_bdevs_list": [ 00:18:37.451 { 00:18:37.451 "name": "NewBaseBdev", 00:18:37.451 "uuid": "ddebcc67-0411-46a2-9488-aebcd985f0a4", 00:18:37.451 "is_configured": true, 00:18:37.451 "data_offset": 2048, 00:18:37.451 "data_size": 63488 00:18:37.451 }, 00:18:37.451 { 00:18:37.451 "name": "BaseBdev2", 00:18:37.451 "uuid": "3be16ef8-16d8-47b8-98f4-661e312ad37c", 00:18:37.451 "is_configured": true, 00:18:37.451 "data_offset": 2048, 00:18:37.451 "data_size": 63488 00:18:37.451 }, 00:18:37.451 { 00:18:37.451 "name": "BaseBdev3", 00:18:37.451 "uuid": "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3", 00:18:37.451 "is_configured": true, 00:18:37.451 "data_offset": 2048, 00:18:37.451 "data_size": 63488 00:18:37.451 }, 00:18:37.451 { 00:18:37.451 "name": "BaseBdev4", 00:18:37.451 "uuid": "66f7d511-aea7-457e-84a4-ce1525b047b1", 00:18:37.451 "is_configured": true, 00:18:37.451 "data_offset": 2048, 00:18:37.451 "data_size": 63488 00:18:37.451 } 00:18:37.451 ] 00:18:37.451 } 00:18:37.451 } 00:18:37.451 }' 00:18:37.451 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:37.451 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:37.451 BaseBdev2 00:18:37.451 BaseBdev3 00:18:37.451 BaseBdev4' 00:18:37.451 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in 
$base_bdev_names 00:18:37.451 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:37.451 13:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:37.710 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:37.710 "name": "NewBaseBdev", 00:18:37.710 "aliases": [ 00:18:37.710 "ddebcc67-0411-46a2-9488-aebcd985f0a4" 00:18:37.710 ], 00:18:37.710 "product_name": "Malloc disk", 00:18:37.710 "block_size": 512, 00:18:37.710 "num_blocks": 65536, 00:18:37.710 "uuid": "ddebcc67-0411-46a2-9488-aebcd985f0a4", 00:18:37.710 "assigned_rate_limits": { 00:18:37.710 "rw_ios_per_sec": 0, 00:18:37.710 "rw_mbytes_per_sec": 0, 00:18:37.710 "r_mbytes_per_sec": 0, 00:18:37.710 "w_mbytes_per_sec": 0 00:18:37.710 }, 00:18:37.710 "claimed": true, 00:18:37.710 "claim_type": "exclusive_write", 00:18:37.710 "zoned": false, 00:18:37.710 "supported_io_types": { 00:18:37.710 "read": true, 00:18:37.710 "write": true, 00:18:37.710 "unmap": true, 00:18:37.710 "flush": true, 00:18:37.710 "reset": true, 00:18:37.710 "nvme_admin": false, 00:18:37.710 "nvme_io": false, 00:18:37.710 "nvme_io_md": false, 00:18:37.710 "write_zeroes": true, 00:18:37.710 "zcopy": true, 00:18:37.710 "get_zone_info": false, 00:18:37.710 "zone_management": false, 00:18:37.710 "zone_append": false, 00:18:37.710 "compare": false, 00:18:37.710 "compare_and_write": false, 00:18:37.710 "abort": true, 00:18:37.710 "seek_hole": false, 00:18:37.710 "seek_data": false, 00:18:37.710 "copy": true, 00:18:37.710 "nvme_iov_md": false 00:18:37.710 }, 00:18:37.710 "memory_domains": [ 00:18:37.710 { 00:18:37.710 "dma_device_id": "system", 00:18:37.710 "dma_device_type": 1 00:18:37.710 }, 00:18:37.710 { 00:18:37.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.711 "dma_device_type": 2 00:18:37.711 } 00:18:37.711 ], 00:18:37.711 "driver_specific": {} 00:18:37.711 }' 00:18:37.711 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.711 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.711 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:37.711 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.711 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.711 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:37.711 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.969 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.969 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:37.969 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.969 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.969 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:37.969 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:37.969 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:37.969 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:38.228 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:38.228 "name": "BaseBdev2", 00:18:38.228 "aliases": [ 00:18:38.228 "3be16ef8-16d8-47b8-98f4-661e312ad37c" 00:18:38.228 ], 00:18:38.228 "product_name": "Malloc disk", 00:18:38.228 "block_size": 512, 00:18:38.228 "num_blocks": 65536, 00:18:38.228 "uuid": "3be16ef8-16d8-47b8-98f4-661e312ad37c", 00:18:38.228 "assigned_rate_limits": { 00:18:38.228 "rw_ios_per_sec": 0, 00:18:38.228 "rw_mbytes_per_sec": 0, 00:18:38.228 "r_mbytes_per_sec": 0, 00:18:38.228 "w_mbytes_per_sec": 0 00:18:38.228 }, 00:18:38.228 "claimed": true, 00:18:38.228 "claim_type": "exclusive_write", 00:18:38.228 "zoned": false, 00:18:38.228 "supported_io_types": { 00:18:38.228 "read": true, 00:18:38.228 "write": true, 00:18:38.228 "unmap": true, 00:18:38.228 "flush": true, 00:18:38.228 "reset": true, 00:18:38.228 "nvme_admin": false, 00:18:38.228 "nvme_io": false, 00:18:38.228 "nvme_io_md": false, 00:18:38.228 "write_zeroes": true, 00:18:38.228 "zcopy": true, 00:18:38.228 "get_zone_info": false, 00:18:38.228 "zone_management": false, 00:18:38.228 "zone_append": false, 00:18:38.228 "compare": false, 00:18:38.228 "compare_and_write": false, 00:18:38.228 "abort": true, 00:18:38.228 "seek_hole": false, 00:18:38.228 "seek_data": false, 00:18:38.228 "copy": true, 00:18:38.228 "nvme_iov_md": false 00:18:38.228 }, 00:18:38.228 "memory_domains": [ 00:18:38.228 { 00:18:38.228 "dma_device_id": "system", 00:18:38.228 "dma_device_type": 1 00:18:38.228 }, 00:18:38.228 { 00:18:38.228 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.228 "dma_device_type": 2 00:18:38.228 } 00:18:38.228 ], 00:18:38.228 "driver_specific": {} 00:18:38.228 }' 00:18:38.229 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.229 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.229 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:38.229 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.487 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.487 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:38.487 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.487 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.487 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:38.487 13:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.487 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.746 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:38.746 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:38.746 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:38.746 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:38.746 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:38.746 "name": "BaseBdev3", 00:18:38.746 "aliases": [ 00:18:38.746 "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3" 00:18:38.746 ], 00:18:38.746 "product_name": "Malloc disk", 00:18:38.746 "block_size": 512, 00:18:38.746 "num_blocks": 65536, 00:18:38.746 "uuid": "7a7fee5a-2b4e-4b21-8cbf-bd52c9485de3", 00:18:38.746 "assigned_rate_limits": { 00:18:38.746 "rw_ios_per_sec": 0, 00:18:38.746 "rw_mbytes_per_sec": 0, 00:18:38.746 "r_mbytes_per_sec": 0, 00:18:38.746 "w_mbytes_per_sec": 0 00:18:38.746 }, 00:18:38.746 "claimed": true, 00:18:38.746 "claim_type": "exclusive_write", 00:18:38.746 "zoned": false, 00:18:38.746 "supported_io_types": { 00:18:38.746 "read": true, 00:18:38.746 "write": true, 00:18:38.746 "unmap": true, 00:18:38.746 "flush": true, 00:18:38.746 "reset": true, 00:18:38.746 "nvme_admin": false, 00:18:38.746 "nvme_io": false, 00:18:38.746 "nvme_io_md": false, 00:18:38.746 "write_zeroes": true, 00:18:38.746 "zcopy": true, 00:18:38.746 "get_zone_info": false, 00:18:38.746 "zone_management": false, 00:18:38.746 "zone_append": false, 00:18:38.746 "compare": false, 00:18:38.746 "compare_and_write": false, 00:18:38.746 "abort": true, 00:18:38.746 "seek_hole": false, 00:18:38.746 "seek_data": false, 00:18:38.746 "copy": true, 00:18:38.746 "nvme_iov_md": false 00:18:38.746 }, 00:18:38.746 "memory_domains": [ 00:18:38.746 { 00:18:38.746 "dma_device_id": "system", 00:18:38.746 "dma_device_type": 1 00:18:38.746 }, 00:18:38.746 { 00:18:38.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.746 "dma_device_type": 2 00:18:38.746 } 00:18:38.746 ], 00:18:38.746 "driver_specific": {} 00:18:38.746 }' 00:18:38.746 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.005 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.005 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:39.005 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.005 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.005 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:39.005 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.005 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.005 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:39.005 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.264 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.264 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:39.264 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:39.264 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:39.264 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:39.523 
13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:39.523 "name": "BaseBdev4", 00:18:39.523 "aliases": [ 00:18:39.523 "66f7d511-aea7-457e-84a4-ce1525b047b1" 00:18:39.523 ], 00:18:39.523 "product_name": "Malloc disk", 00:18:39.523 "block_size": 512, 00:18:39.523 "num_blocks": 65536, 00:18:39.523 "uuid": "66f7d511-aea7-457e-84a4-ce1525b047b1", 00:18:39.523 "assigned_rate_limits": { 00:18:39.523 "rw_ios_per_sec": 0, 00:18:39.523 "rw_mbytes_per_sec": 0, 00:18:39.523 "r_mbytes_per_sec": 0, 00:18:39.523 "w_mbytes_per_sec": 0 00:18:39.523 }, 00:18:39.523 "claimed": true, 00:18:39.523 "claim_type": "exclusive_write", 00:18:39.523 "zoned": false, 00:18:39.523 "supported_io_types": { 00:18:39.523 "read": true, 00:18:39.523 "write": true, 00:18:39.523 "unmap": true, 00:18:39.523 "flush": true, 00:18:39.523 "reset": true, 00:18:39.523 "nvme_admin": false, 00:18:39.523 "nvme_io": false, 00:18:39.523 "nvme_io_md": false, 00:18:39.523 "write_zeroes": true, 00:18:39.523 "zcopy": true, 00:18:39.523 "get_zone_info": false, 00:18:39.523 "zone_management": false, 00:18:39.523 "zone_append": false, 00:18:39.523 "compare": false, 00:18:39.523 "compare_and_write": false, 00:18:39.523 "abort": true, 00:18:39.523 "seek_hole": false, 00:18:39.523 "seek_data": false, 00:18:39.523 "copy": true, 00:18:39.523 "nvme_iov_md": false 00:18:39.523 }, 00:18:39.523 "memory_domains": [ 00:18:39.523 { 00:18:39.523 "dma_device_id": "system", 00:18:39.523 "dma_device_type": 1 00:18:39.523 }, 00:18:39.523 { 00:18:39.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:39.523 "dma_device_type": 2 00:18:39.523 } 00:18:39.523 ], 00:18:39.523 "driver_specific": {} 00:18:39.523 }' 00:18:39.523 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.523 13:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.523 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:39.523 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.523 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.523 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:39.523 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.782 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.782 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:39.782 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.782 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.782 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:39.782 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:40.042 [2024-07-12 13:44:28.489906] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:40.042 [2024-07-12 13:44:28.489944] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:40.042 [2024-07-12 13:44:28.490007] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:18:40.042 [2024-07-12 13:44:28.490066] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:40.042 [2024-07-12 13:44:28.490078] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19892d0 name Existed_Raid, state offline 00:18:40.042 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 495009 00:18:40.042 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 495009 ']' 00:18:40.042 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 495009 00:18:40.042 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:40.042 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:40.042 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 495009 00:18:40.042 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:40.042 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:40.042 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 495009' 00:18:40.042 killing process with pid 495009 00:18:40.042 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 495009 00:18:40.042 [2024-07-12 13:44:28.556198] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:40.042 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 495009 00:18:40.042 [2024-07-12 13:44:28.598281] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:40.301 13:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:40.301 00:18:40.301 real 0m33.172s 00:18:40.301 user 1m0.845s 00:18:40.301 sys 0m5.935s 00:18:40.301 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:40.301 13:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:40.301 ************************************ 00:18:40.301 END TEST raid_state_function_test_sb 00:18:40.301 ************************************ 00:18:40.301 13:44:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:40.301 13:44:28 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:18:40.301 13:44:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:40.301 13:44:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:40.301 13:44:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:40.561 ************************************ 00:18:40.561 START TEST raid_superblock_test 00:18:40.561 ************************************ 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # 
local base_bdevs_malloc 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=500042 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 500042 /var/tmp/spdk-raid.sock 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 500042 ']' 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:40.561 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:40.561 13:44:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.561 [2024-07-12 13:44:28.979663] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:18:40.561 [2024-07-12 13:44:28.979730] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid500042 ] 00:18:40.561 [2024-07-12 13:44:29.110522] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:40.819 [2024-07-12 13:44:29.218063] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:40.819 [2024-07-12 13:44:29.288939] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:40.819 [2024-07-12 13:44:29.288980] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:41.387 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:41.387 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:41.387 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:41.387 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:41.387 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:41.387 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:41.387 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:41.387 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:41.387 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:41.387 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:41.387 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:41.646 malloc1 00:18:41.646 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:41.905 [2024-07-12 13:44:30.396363] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:41.905 [2024-07-12 13:44:30.396411] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:41.905 [2024-07-12 13:44:30.396432] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7d6e90 00:18:41.905 [2024-07-12 13:44:30.396445] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:41.905 [2024-07-12 13:44:30.398123] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:41.905 [2024-07-12 13:44:30.398153] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:41.905 pt1 00:18:41.905 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:41.905 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:41.905 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:41.905 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:41.905 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:41.905 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:41.905 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:41.905 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:41.905 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:42.163 malloc2 00:18:42.163 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:42.422 [2024-07-12 13:44:30.883609] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:42.422 [2024-07-12 13:44:30.883655] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:42.422 [2024-07-12 13:44:30.883679] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x874fb0 00:18:42.422 [2024-07-12 13:44:30.883691] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:42.422 [2024-07-12 13:44:30.885229] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:42.422 [2024-07-12 13:44:30.885257] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:42.422 pt2 00:18:42.422 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:42.422 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:42.422 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:42.422 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:42.422 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:42.422 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:42.422 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:42.422 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:42.422 13:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:42.681 malloc3 00:18:42.681 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:42.940 [2024-07-12 13:44:31.382696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:42.940 [2024-07-12 13:44:31.382741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:42.940 [2024-07-12 13:44:31.382759] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x875ce0 00:18:42.940 [2024-07-12 13:44:31.382771] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:42.940 [2024-07-12 13:44:31.384311] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:42.940 [2024-07-12 13:44:31.384339] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:42.940 pt3 00:18:42.940 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:42.940 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:42.940 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:42.940 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:42.940 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:42.940 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:42.940 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:42.940 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:42.940 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:43.200 malloc4 00:18:43.200 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:43.459 [2024-07-12 13:44:31.876618] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:43.459 [2024-07-12 13:44:31.876664] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:43.459 [2024-07-12 13:44:31.876682] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x878450 00:18:43.459 [2024-07-12 13:44:31.876695] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:43.459 [2024-07-12 13:44:31.878236] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:43.459 [2024-07-12 13:44:31.878270] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:43.459 pt4 00:18:43.459 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:43.459 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:43.459 13:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:43.719 [2024-07-12 13:44:32.049100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:43.719 [2024-07-12 13:44:32.050288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:43.719 [2024-07-12 13:44:32.050345] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:43.719 [2024-07-12 13:44:32.050390] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:43.719 [2024-07-12 13:44:32.050553] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x7d9c20 00:18:43.719 [2024-07-12 13:44:32.050564] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:43.719 [2024-07-12 13:44:32.050756] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x87a910 00:18:43.719 [2024-07-12 13:44:32.050899] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7d9c20 00:18:43.719 [2024-07-12 13:44:32.050909] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x7d9c20 00:18:43.719 [2024-07-12 13:44:32.051012] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.719 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:43.978 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.978 "name": "raid_bdev1", 00:18:43.978 "uuid": "1ec1afa0-b97c-46b6-a998-206888e6b6c6", 00:18:43.978 "strip_size_kb": 64, 00:18:43.978 "state": "online", 00:18:43.978 "raid_level": "raid0", 00:18:43.978 "superblock": true, 00:18:43.978 "num_base_bdevs": 4, 00:18:43.978 "num_base_bdevs_discovered": 4, 00:18:43.978 "num_base_bdevs_operational": 4, 00:18:43.978 "base_bdevs_list": [ 00:18:43.978 { 00:18:43.978 "name": "pt1", 00:18:43.978 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:43.978 "is_configured": true, 00:18:43.978 "data_offset": 2048, 00:18:43.978 "data_size": 63488 00:18:43.978 }, 00:18:43.978 { 00:18:43.978 "name": "pt2", 00:18:43.978 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:43.978 "is_configured": true, 00:18:43.978 "data_offset": 2048, 00:18:43.978 "data_size": 63488 00:18:43.978 }, 00:18:43.978 { 00:18:43.978 "name": "pt3", 00:18:43.978 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:43.978 "is_configured": true, 00:18:43.978 "data_offset": 2048, 00:18:43.978 "data_size": 63488 00:18:43.978 }, 00:18:43.978 { 00:18:43.978 "name": "pt4", 00:18:43.978 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:43.978 "is_configured": true, 00:18:43.978 "data_offset": 2048, 00:18:43.978 "data_size": 63488 00:18:43.978 } 00:18:43.978 ] 00:18:43.978 }' 00:18:43.978 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.978 13:44:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:44.546 13:44:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:44.546 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:44.546 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:44.546 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:44.546 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:44.546 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:44.546 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:44.546 13:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:44.805 [2024-07-12 13:44:33.140267] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:44.805 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:44.805 "name": "raid_bdev1", 00:18:44.805 "aliases": [ 00:18:44.805 "1ec1afa0-b97c-46b6-a998-206888e6b6c6" 00:18:44.805 ], 00:18:44.805 "product_name": "Raid Volume", 00:18:44.805 "block_size": 512, 00:18:44.805 "num_blocks": 253952, 00:18:44.805 "uuid": "1ec1afa0-b97c-46b6-a998-206888e6b6c6", 00:18:44.805 "assigned_rate_limits": { 00:18:44.805 "rw_ios_per_sec": 0, 00:18:44.805 "rw_mbytes_per_sec": 0, 00:18:44.805 "r_mbytes_per_sec": 0, 00:18:44.805 "w_mbytes_per_sec": 0 00:18:44.805 }, 00:18:44.805 "claimed": false, 00:18:44.805 "zoned": false, 00:18:44.805 "supported_io_types": { 00:18:44.805 "read": true, 00:18:44.805 "write": true, 00:18:44.805 "unmap": true, 00:18:44.805 "flush": true, 00:18:44.805 "reset": true, 00:18:44.805 "nvme_admin": false, 00:18:44.805 "nvme_io": false, 00:18:44.805 "nvme_io_md": false, 00:18:44.805 "write_zeroes": true, 00:18:44.805 "zcopy": false, 00:18:44.805 "get_zone_info": false, 00:18:44.805 "zone_management": false, 00:18:44.805 "zone_append": false, 00:18:44.805 "compare": false, 00:18:44.805 "compare_and_write": false, 00:18:44.805 "abort": false, 00:18:44.805 "seek_hole": false, 00:18:44.805 "seek_data": false, 00:18:44.805 "copy": false, 00:18:44.805 "nvme_iov_md": false 00:18:44.805 }, 00:18:44.805 "memory_domains": [ 00:18:44.805 { 00:18:44.805 "dma_device_id": "system", 00:18:44.805 "dma_device_type": 1 00:18:44.805 }, 00:18:44.805 { 00:18:44.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.805 "dma_device_type": 2 00:18:44.805 }, 00:18:44.805 { 00:18:44.805 "dma_device_id": "system", 00:18:44.805 "dma_device_type": 1 00:18:44.805 }, 00:18:44.805 { 00:18:44.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.805 "dma_device_type": 2 00:18:44.805 }, 00:18:44.805 { 00:18:44.805 "dma_device_id": "system", 00:18:44.805 "dma_device_type": 1 00:18:44.805 }, 00:18:44.805 { 00:18:44.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.805 "dma_device_type": 2 00:18:44.805 }, 00:18:44.805 { 00:18:44.805 "dma_device_id": "system", 00:18:44.805 "dma_device_type": 1 00:18:44.805 }, 00:18:44.805 { 00:18:44.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.805 "dma_device_type": 2 00:18:44.805 } 00:18:44.805 ], 00:18:44.805 "driver_specific": { 00:18:44.805 "raid": { 00:18:44.805 "uuid": "1ec1afa0-b97c-46b6-a998-206888e6b6c6", 00:18:44.805 "strip_size_kb": 64, 00:18:44.805 "state": "online", 00:18:44.805 "raid_level": "raid0", 00:18:44.805 "superblock": 
true, 00:18:44.805 "num_base_bdevs": 4, 00:18:44.805 "num_base_bdevs_discovered": 4, 00:18:44.805 "num_base_bdevs_operational": 4, 00:18:44.805 "base_bdevs_list": [ 00:18:44.805 { 00:18:44.805 "name": "pt1", 00:18:44.805 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:44.805 "is_configured": true, 00:18:44.805 "data_offset": 2048, 00:18:44.805 "data_size": 63488 00:18:44.805 }, 00:18:44.805 { 00:18:44.805 "name": "pt2", 00:18:44.805 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:44.805 "is_configured": true, 00:18:44.805 "data_offset": 2048, 00:18:44.805 "data_size": 63488 00:18:44.805 }, 00:18:44.805 { 00:18:44.805 "name": "pt3", 00:18:44.805 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:44.805 "is_configured": true, 00:18:44.805 "data_offset": 2048, 00:18:44.805 "data_size": 63488 00:18:44.805 }, 00:18:44.805 { 00:18:44.805 "name": "pt4", 00:18:44.805 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:44.805 "is_configured": true, 00:18:44.805 "data_offset": 2048, 00:18:44.805 "data_size": 63488 00:18:44.805 } 00:18:44.805 ] 00:18:44.805 } 00:18:44.805 } 00:18:44.805 }' 00:18:44.805 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:44.805 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:44.805 pt2 00:18:44.805 pt3 00:18:44.805 pt4' 00:18:44.805 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:44.805 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:44.805 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:45.064 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:45.064 "name": "pt1", 00:18:45.064 "aliases": [ 00:18:45.064 "00000000-0000-0000-0000-000000000001" 00:18:45.064 ], 00:18:45.064 "product_name": "passthru", 00:18:45.064 "block_size": 512, 00:18:45.064 "num_blocks": 65536, 00:18:45.064 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:45.064 "assigned_rate_limits": { 00:18:45.064 "rw_ios_per_sec": 0, 00:18:45.064 "rw_mbytes_per_sec": 0, 00:18:45.064 "r_mbytes_per_sec": 0, 00:18:45.064 "w_mbytes_per_sec": 0 00:18:45.064 }, 00:18:45.064 "claimed": true, 00:18:45.064 "claim_type": "exclusive_write", 00:18:45.064 "zoned": false, 00:18:45.064 "supported_io_types": { 00:18:45.064 "read": true, 00:18:45.064 "write": true, 00:18:45.064 "unmap": true, 00:18:45.064 "flush": true, 00:18:45.064 "reset": true, 00:18:45.064 "nvme_admin": false, 00:18:45.064 "nvme_io": false, 00:18:45.064 "nvme_io_md": false, 00:18:45.064 "write_zeroes": true, 00:18:45.064 "zcopy": true, 00:18:45.064 "get_zone_info": false, 00:18:45.064 "zone_management": false, 00:18:45.064 "zone_append": false, 00:18:45.064 "compare": false, 00:18:45.064 "compare_and_write": false, 00:18:45.064 "abort": true, 00:18:45.064 "seek_hole": false, 00:18:45.064 "seek_data": false, 00:18:45.064 "copy": true, 00:18:45.064 "nvme_iov_md": false 00:18:45.064 }, 00:18:45.064 "memory_domains": [ 00:18:45.064 { 00:18:45.064 "dma_device_id": "system", 00:18:45.064 "dma_device_type": 1 00:18:45.064 }, 00:18:45.064 { 00:18:45.064 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.064 "dma_device_type": 2 00:18:45.064 } 00:18:45.064 ], 00:18:45.064 "driver_specific": { 00:18:45.064 "passthru": 
{ 00:18:45.064 "name": "pt1", 00:18:45.064 "base_bdev_name": "malloc1" 00:18:45.064 } 00:18:45.064 } 00:18:45.064 }' 00:18:45.064 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.064 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.064 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:45.064 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.064 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.064 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:45.064 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.323 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.323 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:45.323 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.323 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.323 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:45.323 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:45.323 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:45.323 13:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:45.582 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:45.582 "name": "pt2", 00:18:45.582 "aliases": [ 00:18:45.582 "00000000-0000-0000-0000-000000000002" 00:18:45.582 ], 00:18:45.582 "product_name": "passthru", 00:18:45.582 "block_size": 512, 00:18:45.582 "num_blocks": 65536, 00:18:45.582 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:45.582 "assigned_rate_limits": { 00:18:45.582 "rw_ios_per_sec": 0, 00:18:45.582 "rw_mbytes_per_sec": 0, 00:18:45.582 "r_mbytes_per_sec": 0, 00:18:45.582 "w_mbytes_per_sec": 0 00:18:45.582 }, 00:18:45.582 "claimed": true, 00:18:45.582 "claim_type": "exclusive_write", 00:18:45.582 "zoned": false, 00:18:45.582 "supported_io_types": { 00:18:45.582 "read": true, 00:18:45.582 "write": true, 00:18:45.582 "unmap": true, 00:18:45.582 "flush": true, 00:18:45.582 "reset": true, 00:18:45.582 "nvme_admin": false, 00:18:45.582 "nvme_io": false, 00:18:45.582 "nvme_io_md": false, 00:18:45.582 "write_zeroes": true, 00:18:45.582 "zcopy": true, 00:18:45.582 "get_zone_info": false, 00:18:45.582 "zone_management": false, 00:18:45.582 "zone_append": false, 00:18:45.582 "compare": false, 00:18:45.582 "compare_and_write": false, 00:18:45.582 "abort": true, 00:18:45.582 "seek_hole": false, 00:18:45.582 "seek_data": false, 00:18:45.582 "copy": true, 00:18:45.582 "nvme_iov_md": false 00:18:45.582 }, 00:18:45.582 "memory_domains": [ 00:18:45.582 { 00:18:45.582 "dma_device_id": "system", 00:18:45.582 "dma_device_type": 1 00:18:45.582 }, 00:18:45.582 { 00:18:45.582 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.582 "dma_device_type": 2 00:18:45.582 } 00:18:45.582 ], 00:18:45.582 "driver_specific": { 00:18:45.582 "passthru": { 00:18:45.582 "name": "pt2", 00:18:45.582 "base_bdev_name": "malloc2" 00:18:45.582 } 00:18:45.582 } 00:18:45.582 }' 00:18:45.582 
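By this point the array under test is fully assembled: four 32 MB malloc bdevs with 512-byte blocks, a passthru bdev pt1..pt4 with a fixed UUID stacked on each, and a raid0 volume raid_bdev1 with a 64 KB strip size and an on-disk superblock (-s), which the JSON dumps above report as "online" with all four base bdevs configured. A condensed sketch of the same RPC sequence is below; it assumes an SPDK app is already listening on /var/tmp/spdk-raid.sock, and the $rpc shorthand is an illustration rather than something the test defines.

  rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"   # shorthand, not from the test itself
  for i in 1 2 3 4; do
      $rpc bdev_malloc_create 32 512 -b malloc$i       # 32 MB backing bdev, 512-byte blocks
      $rpc bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
  done
  $rpc bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s   # -s: write a superblock
  # Mirror of verify_raid_bdev_state: expect the volume to be "online".
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'

The per-base-bdev checks that continue in the log (block_size, md_size, md_interleave, dif_type) come from bdev_get_bdevs -b ptN piped through jq, verifying block_size == 512 and that md_size, md_interleave and dif_type are all null, exactly as the [[ ... ]] tests above show.
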
13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.583 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.583 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:45.583 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.842 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.842 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:45.842 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.842 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.842 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:45.842 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.842 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.842 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:45.842 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:45.842 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:45.842 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:46.101 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:46.101 "name": "pt3", 00:18:46.101 "aliases": [ 00:18:46.101 "00000000-0000-0000-0000-000000000003" 00:18:46.101 ], 00:18:46.101 "product_name": "passthru", 00:18:46.101 "block_size": 512, 00:18:46.101 "num_blocks": 65536, 00:18:46.101 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:46.101 "assigned_rate_limits": { 00:18:46.101 "rw_ios_per_sec": 0, 00:18:46.101 "rw_mbytes_per_sec": 0, 00:18:46.101 "r_mbytes_per_sec": 0, 00:18:46.101 "w_mbytes_per_sec": 0 00:18:46.101 }, 00:18:46.101 "claimed": true, 00:18:46.101 "claim_type": "exclusive_write", 00:18:46.101 "zoned": false, 00:18:46.101 "supported_io_types": { 00:18:46.101 "read": true, 00:18:46.101 "write": true, 00:18:46.101 "unmap": true, 00:18:46.101 "flush": true, 00:18:46.101 "reset": true, 00:18:46.101 "nvme_admin": false, 00:18:46.101 "nvme_io": false, 00:18:46.101 "nvme_io_md": false, 00:18:46.101 "write_zeroes": true, 00:18:46.101 "zcopy": true, 00:18:46.101 "get_zone_info": false, 00:18:46.101 "zone_management": false, 00:18:46.101 "zone_append": false, 00:18:46.101 "compare": false, 00:18:46.101 "compare_and_write": false, 00:18:46.101 "abort": true, 00:18:46.101 "seek_hole": false, 00:18:46.101 "seek_data": false, 00:18:46.101 "copy": true, 00:18:46.101 "nvme_iov_md": false 00:18:46.101 }, 00:18:46.101 "memory_domains": [ 00:18:46.101 { 00:18:46.101 "dma_device_id": "system", 00:18:46.101 "dma_device_type": 1 00:18:46.101 }, 00:18:46.101 { 00:18:46.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.101 "dma_device_type": 2 00:18:46.101 } 00:18:46.101 ], 00:18:46.101 "driver_specific": { 00:18:46.101 "passthru": { 00:18:46.101 "name": "pt3", 00:18:46.101 "base_bdev_name": "malloc3" 00:18:46.101 } 00:18:46.101 } 00:18:46.101 }' 00:18:46.101 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.359 13:44:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.359 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:46.359 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.359 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.359 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:46.359 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.359 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.359 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:46.359 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.618 13:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.618 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:46.618 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:46.618 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:46.618 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:46.877 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:46.877 "name": "pt4", 00:18:46.877 "aliases": [ 00:18:46.877 "00000000-0000-0000-0000-000000000004" 00:18:46.877 ], 00:18:46.877 "product_name": "passthru", 00:18:46.877 "block_size": 512, 00:18:46.877 "num_blocks": 65536, 00:18:46.877 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:46.877 "assigned_rate_limits": { 00:18:46.877 "rw_ios_per_sec": 0, 00:18:46.877 "rw_mbytes_per_sec": 0, 00:18:46.877 "r_mbytes_per_sec": 0, 00:18:46.877 "w_mbytes_per_sec": 0 00:18:46.877 }, 00:18:46.877 "claimed": true, 00:18:46.877 "claim_type": "exclusive_write", 00:18:46.877 "zoned": false, 00:18:46.877 "supported_io_types": { 00:18:46.877 "read": true, 00:18:46.877 "write": true, 00:18:46.877 "unmap": true, 00:18:46.877 "flush": true, 00:18:46.877 "reset": true, 00:18:46.877 "nvme_admin": false, 00:18:46.877 "nvme_io": false, 00:18:46.877 "nvme_io_md": false, 00:18:46.877 "write_zeroes": true, 00:18:46.877 "zcopy": true, 00:18:46.877 "get_zone_info": false, 00:18:46.877 "zone_management": false, 00:18:46.877 "zone_append": false, 00:18:46.877 "compare": false, 00:18:46.877 "compare_and_write": false, 00:18:46.877 "abort": true, 00:18:46.877 "seek_hole": false, 00:18:46.877 "seek_data": false, 00:18:46.877 "copy": true, 00:18:46.877 "nvme_iov_md": false 00:18:46.877 }, 00:18:46.877 "memory_domains": [ 00:18:46.877 { 00:18:46.877 "dma_device_id": "system", 00:18:46.877 "dma_device_type": 1 00:18:46.877 }, 00:18:46.877 { 00:18:46.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.877 "dma_device_type": 2 00:18:46.877 } 00:18:46.877 ], 00:18:46.877 "driver_specific": { 00:18:46.877 "passthru": { 00:18:46.877 "name": "pt4", 00:18:46.877 "base_bdev_name": "malloc4" 00:18:46.877 } 00:18:46.877 } 00:18:46.877 }' 00:18:46.877 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.877 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.877 13:44:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:46.877 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.877 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.877 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:46.877 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.136 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.136 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:47.136 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.136 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.136 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:47.136 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:47.136 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:47.395 [2024-07-12 13:44:35.843454] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:47.395 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=1ec1afa0-b97c-46b6-a998-206888e6b6c6 00:18:47.396 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 1ec1afa0-b97c-46b6-a998-206888e6b6c6 ']' 00:18:47.396 13:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:47.655 [2024-07-12 13:44:36.087789] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:47.655 [2024-07-12 13:44:36.087813] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:47.655 [2024-07-12 13:44:36.087864] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:47.655 [2024-07-12 13:44:36.087946] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:47.655 [2024-07-12 13:44:36.087958] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d9c20 name raid_bdev1, state offline 00:18:47.655 13:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.655 13:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:47.914 13:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:47.914 13:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:47.914 13:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:47.914 13:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:48.173 13:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:48.173 13:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:48.431 13:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:48.431 13:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:48.690 13:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:48.690 13:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:48.948 13:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:48.948 13:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:49.208 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:49.467 [2024-07-12 13:44:37.800252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:49.467 [2024-07-12 13:44:37.801640] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:49.467 [2024-07-12 13:44:37.801690] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
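The sequence just above tears the volume down: raid_bdev1 is deleted, bdev_raid_get_bdevs confirms nothing is left, the four passthru bdevs are removed with bdev_passthru_delete, and bdev_get_bdevs verifies no passthru bdev remains. The test then deliberately tries to recreate raid_bdev1 directly on malloc1..malloc4; because each malloc bdev still carries the superblock written for the deleted array, that bdev_raid_create is expected to fail, as the JSON-RPC error response just below confirms (code -17, "File exists"). A sketch of the same sequence, under the same illustrative $rpc shorthand as before; the echo messages exist only in the sketch.

  $rpc bdev_raid_delete raid_bdev1
  for i in 1 2 3 4; do $rpc bdev_passthru_delete pt$i; done
  # No passthru bdevs should remain; the test expects this to print "false".
  $rpc bdev_get_bdevs | jq -r '[.[] | select(.product_name == "passthru")] | any'
  # Recreating on the raw malloc bdevs should fail: each still holds the old raid superblock.
  if $rpc bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1; then
      echo "unexpected success"
  else
      echo "failed as expected: File exists (-17)"
  fi
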
00:18:49.467 [2024-07-12 13:44:37.801724] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:49.467 [2024-07-12 13:44:37.801771] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:49.467 [2024-07-12 13:44:37.801809] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:49.467 [2024-07-12 13:44:37.801832] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:49.467 [2024-07-12 13:44:37.801854] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:49.467 [2024-07-12 13:44:37.801872] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:49.467 [2024-07-12 13:44:37.801882] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x879a40 name raid_bdev1, state configuring 00:18:49.467 request: 00:18:49.467 { 00:18:49.467 "name": "raid_bdev1", 00:18:49.467 "raid_level": "raid0", 00:18:49.467 "base_bdevs": [ 00:18:49.467 "malloc1", 00:18:49.467 "malloc2", 00:18:49.467 "malloc3", 00:18:49.467 "malloc4" 00:18:49.467 ], 00:18:49.467 "strip_size_kb": 64, 00:18:49.467 "superblock": false, 00:18:49.467 "method": "bdev_raid_create", 00:18:49.467 "req_id": 1 00:18:49.467 } 00:18:49.467 Got JSON-RPC error response 00:18:49.467 response: 00:18:49.467 { 00:18:49.467 "code": -17, 00:18:49.468 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:49.468 } 00:18:49.468 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:49.468 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:49.468 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:49.468 13:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:49.468 13:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.468 13:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:49.727 [2024-07-12 13:44:38.277440] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:49.727 [2024-07-12 13:44:38.277488] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:49.727 [2024-07-12 13:44:38.277506] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7d99b0 00:18:49.727 [2024-07-12 13:44:38.277519] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:49.727 [2024-07-12 13:44:38.279165] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:49.727 [2024-07-12 13:44:38.279193] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:49.727 [2024-07-12 
13:44:38.279257] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:49.727 [2024-07-12 13:44:38.279281] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:49.727 pt1 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.727 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:49.987 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:49.987 "name": "raid_bdev1", 00:18:49.987 "uuid": "1ec1afa0-b97c-46b6-a998-206888e6b6c6", 00:18:49.987 "strip_size_kb": 64, 00:18:49.987 "state": "configuring", 00:18:49.987 "raid_level": "raid0", 00:18:49.987 "superblock": true, 00:18:49.987 "num_base_bdevs": 4, 00:18:49.987 "num_base_bdevs_discovered": 1, 00:18:49.987 "num_base_bdevs_operational": 4, 00:18:49.987 "base_bdevs_list": [ 00:18:49.987 { 00:18:49.987 "name": "pt1", 00:18:49.987 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:49.987 "is_configured": true, 00:18:49.987 "data_offset": 2048, 00:18:49.987 "data_size": 63488 00:18:49.987 }, 00:18:49.987 { 00:18:49.987 "name": null, 00:18:49.987 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:49.987 "is_configured": false, 00:18:49.987 "data_offset": 2048, 00:18:49.987 "data_size": 63488 00:18:49.987 }, 00:18:49.987 { 00:18:49.987 "name": null, 00:18:49.987 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:49.987 "is_configured": false, 00:18:49.987 "data_offset": 2048, 00:18:49.987 "data_size": 63488 00:18:49.987 }, 00:18:49.987 { 00:18:49.987 "name": null, 00:18:49.987 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:49.987 "is_configured": false, 00:18:49.987 "data_offset": 2048, 00:18:49.987 "data_size": 63488 00:18:49.987 } 00:18:49.987 ] 00:18:49.987 }' 00:18:49.987 13:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:49.987 13:44:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:50.925 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:18:50.925 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:51.184 [2024-07-12 13:44:39.508714] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:51.184 [2024-07-12 13:44:39.508768] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:51.184 [2024-07-12 13:44:39.508788] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x878fa0 00:18:51.184 [2024-07-12 13:44:39.508800] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:51.184 [2024-07-12 13:44:39.509163] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:51.184 [2024-07-12 13:44:39.509184] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:51.184 [2024-07-12 13:44:39.509250] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:51.184 [2024-07-12 13:44:39.509269] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:51.184 pt2 00:18:51.184 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:51.184 [2024-07-12 13:44:39.753373] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:51.443 13:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.443 13:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.443 "name": "raid_bdev1", 00:18:51.443 "uuid": "1ec1afa0-b97c-46b6-a998-206888e6b6c6", 00:18:51.443 "strip_size_kb": 64, 00:18:51.443 "state": "configuring", 00:18:51.443 "raid_level": "raid0", 00:18:51.443 "superblock": true, 00:18:51.443 "num_base_bdevs": 4, 00:18:51.443 "num_base_bdevs_discovered": 1, 00:18:51.443 "num_base_bdevs_operational": 4, 00:18:51.443 "base_bdevs_list": [ 00:18:51.443 { 00:18:51.443 "name": "pt1", 00:18:51.443 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:51.443 "is_configured": true, 00:18:51.443 "data_offset": 2048, 00:18:51.443 "data_size": 63488 00:18:51.443 }, 00:18:51.443 { 
00:18:51.443 "name": null, 00:18:51.443 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:51.443 "is_configured": false, 00:18:51.443 "data_offset": 2048, 00:18:51.443 "data_size": 63488 00:18:51.443 }, 00:18:51.443 { 00:18:51.443 "name": null, 00:18:51.443 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:51.443 "is_configured": false, 00:18:51.443 "data_offset": 2048, 00:18:51.443 "data_size": 63488 00:18:51.443 }, 00:18:51.443 { 00:18:51.443 "name": null, 00:18:51.443 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:51.443 "is_configured": false, 00:18:51.443 "data_offset": 2048, 00:18:51.443 "data_size": 63488 00:18:51.443 } 00:18:51.443 ] 00:18:51.443 }' 00:18:51.443 13:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.443 13:44:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:52.387 13:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:52.387 13:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:52.387 13:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:52.387 [2024-07-12 13:44:40.844255] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:52.387 [2024-07-12 13:44:40.844306] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.387 [2024-07-12 13:44:40.844326] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7d7ab0 00:18:52.387 [2024-07-12 13:44:40.844338] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.387 [2024-07-12 13:44:40.844676] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.387 [2024-07-12 13:44:40.844694] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:52.387 [2024-07-12 13:44:40.844756] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:52.387 [2024-07-12 13:44:40.844774] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:52.387 pt2 00:18:52.387 13:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:52.387 13:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:52.387 13:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:52.646 [2024-07-12 13:44:41.092916] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:52.647 [2024-07-12 13:44:41.092961] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.647 [2024-07-12 13:44:41.092977] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8795d0 00:18:52.647 [2024-07-12 13:44:41.092989] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.647 [2024-07-12 13:44:41.093280] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.647 [2024-07-12 13:44:41.093297] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:52.647 [2024-07-12 13:44:41.093348] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:52.647 [2024-07-12 13:44:41.093371] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:52.647 pt3 00:18:52.647 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:52.647 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:52.647 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:52.906 [2024-07-12 13:44:41.277406] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:52.906 [2024-07-12 13:44:41.277441] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.906 [2024-07-12 13:44:41.277456] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7cdf60 00:18:52.906 [2024-07-12 13:44:41.277469] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.906 [2024-07-12 13:44:41.277771] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.906 [2024-07-12 13:44:41.277789] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:52.906 [2024-07-12 13:44:41.277844] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:52.906 [2024-07-12 13:44:41.277863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:52.906 [2024-07-12 13:44:41.277986] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x87a4b0 00:18:52.906 [2024-07-12 13:44:41.277997] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:52.906 [2024-07-12 13:44:41.278170] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x87dc40 00:18:52.906 [2024-07-12 13:44:41.278295] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x87a4b0 00:18:52.906 [2024-07-12 13:44:41.278305] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x87a4b0 00:18:52.906 [2024-07-12 13:44:41.278403] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:52.906 pt4 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:52.906 13:44:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.906 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:53.166 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.166 "name": "raid_bdev1", 00:18:53.166 "uuid": "1ec1afa0-b97c-46b6-a998-206888e6b6c6", 00:18:53.166 "strip_size_kb": 64, 00:18:53.166 "state": "online", 00:18:53.166 "raid_level": "raid0", 00:18:53.166 "superblock": true, 00:18:53.166 "num_base_bdevs": 4, 00:18:53.166 "num_base_bdevs_discovered": 4, 00:18:53.166 "num_base_bdevs_operational": 4, 00:18:53.166 "base_bdevs_list": [ 00:18:53.166 { 00:18:53.166 "name": "pt1", 00:18:53.166 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:53.166 "is_configured": true, 00:18:53.166 "data_offset": 2048, 00:18:53.166 "data_size": 63488 00:18:53.166 }, 00:18:53.166 { 00:18:53.166 "name": "pt2", 00:18:53.166 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:53.166 "is_configured": true, 00:18:53.166 "data_offset": 2048, 00:18:53.166 "data_size": 63488 00:18:53.166 }, 00:18:53.166 { 00:18:53.166 "name": "pt3", 00:18:53.166 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:53.166 "is_configured": true, 00:18:53.166 "data_offset": 2048, 00:18:53.166 "data_size": 63488 00:18:53.166 }, 00:18:53.166 { 00:18:53.166 "name": "pt4", 00:18:53.166 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:53.166 "is_configured": true, 00:18:53.166 "data_offset": 2048, 00:18:53.166 "data_size": 63488 00:18:53.166 } 00:18:53.166 ] 00:18:53.166 }' 00:18:53.166 13:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.166 13:44:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:53.734 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:53.734 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:53.734 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:53.734 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:53.734 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:53.734 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:53.734 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:53.734 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:53.993 [2024-07-12 13:44:42.368624] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:53.993 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:53.993 "name": "raid_bdev1", 00:18:53.993 "aliases": [ 00:18:53.993 "1ec1afa0-b97c-46b6-a998-206888e6b6c6" 00:18:53.993 ], 00:18:53.993 "product_name": "Raid Volume", 00:18:53.993 "block_size": 512, 00:18:53.993 "num_blocks": 253952, 00:18:53.993 "uuid": 
"1ec1afa0-b97c-46b6-a998-206888e6b6c6", 00:18:53.993 "assigned_rate_limits": { 00:18:53.993 "rw_ios_per_sec": 0, 00:18:53.993 "rw_mbytes_per_sec": 0, 00:18:53.993 "r_mbytes_per_sec": 0, 00:18:53.993 "w_mbytes_per_sec": 0 00:18:53.993 }, 00:18:53.993 "claimed": false, 00:18:53.993 "zoned": false, 00:18:53.993 "supported_io_types": { 00:18:53.993 "read": true, 00:18:53.993 "write": true, 00:18:53.993 "unmap": true, 00:18:53.993 "flush": true, 00:18:53.993 "reset": true, 00:18:53.993 "nvme_admin": false, 00:18:53.993 "nvme_io": false, 00:18:53.993 "nvme_io_md": false, 00:18:53.993 "write_zeroes": true, 00:18:53.993 "zcopy": false, 00:18:53.993 "get_zone_info": false, 00:18:53.993 "zone_management": false, 00:18:53.993 "zone_append": false, 00:18:53.993 "compare": false, 00:18:53.993 "compare_and_write": false, 00:18:53.993 "abort": false, 00:18:53.993 "seek_hole": false, 00:18:53.993 "seek_data": false, 00:18:53.993 "copy": false, 00:18:53.993 "nvme_iov_md": false 00:18:53.993 }, 00:18:53.993 "memory_domains": [ 00:18:53.993 { 00:18:53.993 "dma_device_id": "system", 00:18:53.993 "dma_device_type": 1 00:18:53.993 }, 00:18:53.993 { 00:18:53.993 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.993 "dma_device_type": 2 00:18:53.993 }, 00:18:53.993 { 00:18:53.993 "dma_device_id": "system", 00:18:53.993 "dma_device_type": 1 00:18:53.993 }, 00:18:53.993 { 00:18:53.993 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.993 "dma_device_type": 2 00:18:53.993 }, 00:18:53.993 { 00:18:53.993 "dma_device_id": "system", 00:18:53.993 "dma_device_type": 1 00:18:53.993 }, 00:18:53.993 { 00:18:53.993 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.993 "dma_device_type": 2 00:18:53.993 }, 00:18:53.993 { 00:18:53.993 "dma_device_id": "system", 00:18:53.993 "dma_device_type": 1 00:18:53.993 }, 00:18:53.993 { 00:18:53.993 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.993 "dma_device_type": 2 00:18:53.993 } 00:18:53.993 ], 00:18:53.993 "driver_specific": { 00:18:53.993 "raid": { 00:18:53.993 "uuid": "1ec1afa0-b97c-46b6-a998-206888e6b6c6", 00:18:53.993 "strip_size_kb": 64, 00:18:53.993 "state": "online", 00:18:53.993 "raid_level": "raid0", 00:18:53.993 "superblock": true, 00:18:53.993 "num_base_bdevs": 4, 00:18:53.993 "num_base_bdevs_discovered": 4, 00:18:53.993 "num_base_bdevs_operational": 4, 00:18:53.993 "base_bdevs_list": [ 00:18:53.993 { 00:18:53.993 "name": "pt1", 00:18:53.993 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:53.993 "is_configured": true, 00:18:53.993 "data_offset": 2048, 00:18:53.993 "data_size": 63488 00:18:53.993 }, 00:18:53.993 { 00:18:53.993 "name": "pt2", 00:18:53.994 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:53.994 "is_configured": true, 00:18:53.994 "data_offset": 2048, 00:18:53.994 "data_size": 63488 00:18:53.994 }, 00:18:53.994 { 00:18:53.994 "name": "pt3", 00:18:53.994 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:53.994 "is_configured": true, 00:18:53.994 "data_offset": 2048, 00:18:53.994 "data_size": 63488 00:18:53.994 }, 00:18:53.994 { 00:18:53.994 "name": "pt4", 00:18:53.994 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:53.994 "is_configured": true, 00:18:53.994 "data_offset": 2048, 00:18:53.994 "data_size": 63488 00:18:53.994 } 00:18:53.994 ] 00:18:53.994 } 00:18:53.994 } 00:18:53.994 }' 00:18:53.994 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:53.994 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:18:53.994 pt2 00:18:53.994 pt3 00:18:53.994 pt4' 00:18:53.994 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:53.994 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:53.994 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:54.565 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:54.565 "name": "pt1", 00:18:54.565 "aliases": [ 00:18:54.565 "00000000-0000-0000-0000-000000000001" 00:18:54.565 ], 00:18:54.565 "product_name": "passthru", 00:18:54.565 "block_size": 512, 00:18:54.565 "num_blocks": 65536, 00:18:54.565 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:54.565 "assigned_rate_limits": { 00:18:54.565 "rw_ios_per_sec": 0, 00:18:54.565 "rw_mbytes_per_sec": 0, 00:18:54.565 "r_mbytes_per_sec": 0, 00:18:54.565 "w_mbytes_per_sec": 0 00:18:54.565 }, 00:18:54.565 "claimed": true, 00:18:54.565 "claim_type": "exclusive_write", 00:18:54.565 "zoned": false, 00:18:54.565 "supported_io_types": { 00:18:54.565 "read": true, 00:18:54.565 "write": true, 00:18:54.565 "unmap": true, 00:18:54.565 "flush": true, 00:18:54.565 "reset": true, 00:18:54.565 "nvme_admin": false, 00:18:54.565 "nvme_io": false, 00:18:54.565 "nvme_io_md": false, 00:18:54.565 "write_zeroes": true, 00:18:54.565 "zcopy": true, 00:18:54.565 "get_zone_info": false, 00:18:54.565 "zone_management": false, 00:18:54.565 "zone_append": false, 00:18:54.565 "compare": false, 00:18:54.565 "compare_and_write": false, 00:18:54.565 "abort": true, 00:18:54.565 "seek_hole": false, 00:18:54.565 "seek_data": false, 00:18:54.565 "copy": true, 00:18:54.565 "nvme_iov_md": false 00:18:54.565 }, 00:18:54.565 "memory_domains": [ 00:18:54.565 { 00:18:54.565 "dma_device_id": "system", 00:18:54.565 "dma_device_type": 1 00:18:54.565 }, 00:18:54.565 { 00:18:54.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.565 "dma_device_type": 2 00:18:54.565 } 00:18:54.565 ], 00:18:54.565 "driver_specific": { 00:18:54.565 "passthru": { 00:18:54.565 "name": "pt1", 00:18:54.565 "base_bdev_name": "malloc1" 00:18:54.565 } 00:18:54.565 } 00:18:54.565 }' 00:18:54.565 13:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.565 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.565 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:54.565 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.826 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.826 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:54.826 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.826 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.826 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:54.826 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.826 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.084 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.084 13:44:43 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:55.084 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:55.084 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:55.342 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:55.342 "name": "pt2", 00:18:55.342 "aliases": [ 00:18:55.342 "00000000-0000-0000-0000-000000000002" 00:18:55.342 ], 00:18:55.342 "product_name": "passthru", 00:18:55.342 "block_size": 512, 00:18:55.342 "num_blocks": 65536, 00:18:55.342 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:55.342 "assigned_rate_limits": { 00:18:55.342 "rw_ios_per_sec": 0, 00:18:55.342 "rw_mbytes_per_sec": 0, 00:18:55.342 "r_mbytes_per_sec": 0, 00:18:55.342 "w_mbytes_per_sec": 0 00:18:55.342 }, 00:18:55.342 "claimed": true, 00:18:55.342 "claim_type": "exclusive_write", 00:18:55.342 "zoned": false, 00:18:55.342 "supported_io_types": { 00:18:55.342 "read": true, 00:18:55.342 "write": true, 00:18:55.342 "unmap": true, 00:18:55.342 "flush": true, 00:18:55.342 "reset": true, 00:18:55.342 "nvme_admin": false, 00:18:55.342 "nvme_io": false, 00:18:55.342 "nvme_io_md": false, 00:18:55.342 "write_zeroes": true, 00:18:55.342 "zcopy": true, 00:18:55.342 "get_zone_info": false, 00:18:55.342 "zone_management": false, 00:18:55.342 "zone_append": false, 00:18:55.342 "compare": false, 00:18:55.342 "compare_and_write": false, 00:18:55.342 "abort": true, 00:18:55.342 "seek_hole": false, 00:18:55.342 "seek_data": false, 00:18:55.342 "copy": true, 00:18:55.342 "nvme_iov_md": false 00:18:55.342 }, 00:18:55.342 "memory_domains": [ 00:18:55.342 { 00:18:55.342 "dma_device_id": "system", 00:18:55.342 "dma_device_type": 1 00:18:55.342 }, 00:18:55.342 { 00:18:55.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.342 "dma_device_type": 2 00:18:55.342 } 00:18:55.342 ], 00:18:55.342 "driver_specific": { 00:18:55.342 "passthru": { 00:18:55.342 "name": "pt2", 00:18:55.342 "base_bdev_name": "malloc2" 00:18:55.342 } 00:18:55.342 } 00:18:55.342 }' 00:18:55.342 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.342 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.342 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:55.342 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.342 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.342 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:55.342 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.600 13:44:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.600 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:55.600 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.600 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.600 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.600 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:55.600 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:55.600 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:56.168 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:56.168 "name": "pt3", 00:18:56.168 "aliases": [ 00:18:56.168 "00000000-0000-0000-0000-000000000003" 00:18:56.168 ], 00:18:56.168 "product_name": "passthru", 00:18:56.168 "block_size": 512, 00:18:56.168 "num_blocks": 65536, 00:18:56.168 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:56.168 "assigned_rate_limits": { 00:18:56.168 "rw_ios_per_sec": 0, 00:18:56.168 "rw_mbytes_per_sec": 0, 00:18:56.168 "r_mbytes_per_sec": 0, 00:18:56.168 "w_mbytes_per_sec": 0 00:18:56.168 }, 00:18:56.168 "claimed": true, 00:18:56.168 "claim_type": "exclusive_write", 00:18:56.168 "zoned": false, 00:18:56.168 "supported_io_types": { 00:18:56.168 "read": true, 00:18:56.168 "write": true, 00:18:56.168 "unmap": true, 00:18:56.168 "flush": true, 00:18:56.168 "reset": true, 00:18:56.168 "nvme_admin": false, 00:18:56.168 "nvme_io": false, 00:18:56.168 "nvme_io_md": false, 00:18:56.168 "write_zeroes": true, 00:18:56.168 "zcopy": true, 00:18:56.168 "get_zone_info": false, 00:18:56.168 "zone_management": false, 00:18:56.168 "zone_append": false, 00:18:56.168 "compare": false, 00:18:56.168 "compare_and_write": false, 00:18:56.168 "abort": true, 00:18:56.168 "seek_hole": false, 00:18:56.168 "seek_data": false, 00:18:56.168 "copy": true, 00:18:56.168 "nvme_iov_md": false 00:18:56.168 }, 00:18:56.168 "memory_domains": [ 00:18:56.168 { 00:18:56.168 "dma_device_id": "system", 00:18:56.168 "dma_device_type": 1 00:18:56.168 }, 00:18:56.168 { 00:18:56.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.169 "dma_device_type": 2 00:18:56.169 } 00:18:56.169 ], 00:18:56.169 "driver_specific": { 00:18:56.169 "passthru": { 00:18:56.169 "name": "pt3", 00:18:56.169 "base_bdev_name": "malloc3" 00:18:56.169 } 00:18:56.169 } 00:18:56.169 }' 00:18:56.169 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:56.427 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:56.427 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:56.427 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.427 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.427 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:56.427 13:44:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:56.686 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:56.686 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:56.686 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.686 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.686 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:56.686 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:56.686 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:56.686 
13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:56.945 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:56.945 "name": "pt4", 00:18:56.945 "aliases": [ 00:18:56.945 "00000000-0000-0000-0000-000000000004" 00:18:56.945 ], 00:18:56.945 "product_name": "passthru", 00:18:56.945 "block_size": 512, 00:18:56.945 "num_blocks": 65536, 00:18:56.945 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:56.945 "assigned_rate_limits": { 00:18:56.945 "rw_ios_per_sec": 0, 00:18:56.945 "rw_mbytes_per_sec": 0, 00:18:56.945 "r_mbytes_per_sec": 0, 00:18:56.945 "w_mbytes_per_sec": 0 00:18:56.945 }, 00:18:56.945 "claimed": true, 00:18:56.945 "claim_type": "exclusive_write", 00:18:56.945 "zoned": false, 00:18:56.945 "supported_io_types": { 00:18:56.945 "read": true, 00:18:56.945 "write": true, 00:18:56.945 "unmap": true, 00:18:56.945 "flush": true, 00:18:56.945 "reset": true, 00:18:56.945 "nvme_admin": false, 00:18:56.945 "nvme_io": false, 00:18:56.945 "nvme_io_md": false, 00:18:56.945 "write_zeroes": true, 00:18:56.945 "zcopy": true, 00:18:56.945 "get_zone_info": false, 00:18:56.945 "zone_management": false, 00:18:56.945 "zone_append": false, 00:18:56.945 "compare": false, 00:18:56.945 "compare_and_write": false, 00:18:56.945 "abort": true, 00:18:56.945 "seek_hole": false, 00:18:56.945 "seek_data": false, 00:18:56.945 "copy": true, 00:18:56.945 "nvme_iov_md": false 00:18:56.945 }, 00:18:56.945 "memory_domains": [ 00:18:56.945 { 00:18:56.945 "dma_device_id": "system", 00:18:56.945 "dma_device_type": 1 00:18:56.945 }, 00:18:56.945 { 00:18:56.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.945 "dma_device_type": 2 00:18:56.945 } 00:18:56.945 ], 00:18:56.945 "driver_specific": { 00:18:56.945 "passthru": { 00:18:56.945 "name": "pt4", 00:18:56.945 "base_bdev_name": "malloc4" 00:18:56.945 } 00:18:56.945 } 00:18:56.945 }' 00:18:56.945 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:56.945 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:56.945 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:56.945 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.945 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.945 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:56.945 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.205 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.205 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:57.205 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:57.205 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:57.205 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:57.205 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:57.205 13:44:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:57.774 [2024-07-12 13:44:46.247047] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:57.774 13:44:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 1ec1afa0-b97c-46b6-a998-206888e6b6c6 '!=' 1ec1afa0-b97c-46b6-a998-206888e6b6c6 ']' 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 500042 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 500042 ']' 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 500042 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 500042 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 500042' 00:18:57.774 killing process with pid 500042 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 500042 00:18:57.774 [2024-07-12 13:44:46.329772] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:57.774 [2024-07-12 13:44:46.329837] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:57.774 [2024-07-12 13:44:46.329904] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:57.774 [2024-07-12 13:44:46.329916] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x87a4b0 name raid_bdev1, state offline 00:18:57.774 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 500042 00:18:58.034 [2024-07-12 13:44:46.366180] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:58.034 13:44:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:58.034 00:18:58.034 real 0m17.658s 00:18:58.034 user 0m32.037s 00:18:58.034 sys 0m3.011s 00:18:58.034 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:58.034 13:44:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:58.034 ************************************ 00:18:58.034 END TEST raid_superblock_test 00:18:58.034 ************************************ 00:18:58.294 13:44:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:58.294 13:44:46 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:18:58.294 13:44:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:58.294 13:44:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:58.294 13:44:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:58.294 ************************************ 00:18:58.294 START TEST raid_read_error_test 00:18:58.294 ************************************ 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # 
raid_io_error_test raid0 4 read 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.7S9ElNUFkG 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=502680 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 502680 /var/tmp/spdk-raid.sock 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:58.294 13:44:46 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 502680 ']' 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:58.294 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:58.294 13:44:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:58.294 [2024-07-12 13:44:46.734196] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:18:58.294 [2024-07-12 13:44:46.734263] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid502680 ] 00:18:58.294 [2024-07-12 13:44:46.864096] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:58.554 [2024-07-12 13:44:46.971601] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:58.554 [2024-07-12 13:44:47.039165] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:58.554 [2024-07-12 13:44:47.039207] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:59.122 13:44:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:59.122 13:44:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:59.122 13:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:59.122 13:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:59.381 BaseBdev1_malloc 00:18:59.381 13:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:59.639 true 00:18:59.639 13:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:59.897 [2024-07-12 13:44:48.383397] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:59.897 [2024-07-12 13:44:48.383443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:59.897 [2024-07-12 13:44:48.383462] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe75a10 00:18:59.897 [2024-07-12 13:44:48.383474] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:59.897 [2024-07-12 13:44:48.385298] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:59.897 [2024-07-12 13:44:48.385325] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:59.897 BaseBdev1 00:18:59.897 13:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for 
bdev in "${base_bdevs[@]}" 00:18:59.897 13:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:00.155 BaseBdev2_malloc 00:19:00.155 13:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:00.414 true 00:19:00.414 13:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:00.672 [2024-07-12 13:44:49.145995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:00.672 [2024-07-12 13:44:49.146036] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:00.672 [2024-07-12 13:44:49.146056] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe7a250 00:19:00.672 [2024-07-12 13:44:49.146069] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:00.672 [2024-07-12 13:44:49.147492] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:00.672 [2024-07-12 13:44:49.147518] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:00.672 BaseBdev2 00:19:00.672 13:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:00.672 13:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:00.930 BaseBdev3_malloc 00:19:00.930 13:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:01.189 true 00:19:01.448 13:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:01.706 [2024-07-12 13:44:50.254757] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:01.706 [2024-07-12 13:44:50.254802] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:01.706 [2024-07-12 13:44:50.254822] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe7c510 00:19:01.706 [2024-07-12 13:44:50.254834] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:01.706 [2024-07-12 13:44:50.256460] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:01.706 [2024-07-12 13:44:50.256487] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:01.706 BaseBdev3 00:19:01.706 13:44:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:01.706 13:44:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:01.965 BaseBdev4_malloc 00:19:01.965 13:44:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:02.224 true 00:19:02.224 13:44:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:02.483 [2024-07-12 13:44:51.013382] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:02.483 [2024-07-12 13:44:51.013428] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:02.483 [2024-07-12 13:44:51.013449] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe7d3e0 00:19:02.483 [2024-07-12 13:44:51.013462] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:02.483 [2024-07-12 13:44:51.015074] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:02.483 [2024-07-12 13:44:51.015102] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:02.483 BaseBdev4 00:19:02.483 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:02.741 [2024-07-12 13:44:51.254051] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:02.741 [2024-07-12 13:44:51.255442] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:02.741 [2024-07-12 13:44:51.255511] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:02.741 [2024-07-12 13:44:51.255572] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:02.741 [2024-07-12 13:44:51.255801] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe77560 00:19:02.741 [2024-07-12 13:44:51.255813] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:02.741 [2024-07-12 13:44:51.256018] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xccbba0 00:19:02.741 [2024-07-12 13:44:51.256166] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe77560 00:19:02.741 [2024-07-12 13:44:51.256176] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe77560 00:19:02.741 [2024-07-12 13:44:51.256278] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:02.741 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:02.741 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:02.741 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:02.742 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:02.742 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:02.742 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:02.742 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.742 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:19:02.742 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:02.742 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.742 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.742 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:03.000 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.000 "name": "raid_bdev1", 00:19:03.000 "uuid": "9cc71cf1-d9fc-40c6-9fa7-e5bf5f17cffd", 00:19:03.000 "strip_size_kb": 64, 00:19:03.000 "state": "online", 00:19:03.000 "raid_level": "raid0", 00:19:03.000 "superblock": true, 00:19:03.000 "num_base_bdevs": 4, 00:19:03.000 "num_base_bdevs_discovered": 4, 00:19:03.000 "num_base_bdevs_operational": 4, 00:19:03.000 "base_bdevs_list": [ 00:19:03.000 { 00:19:03.000 "name": "BaseBdev1", 00:19:03.000 "uuid": "2748f01f-7e6b-5df7-b085-2d957a39be08", 00:19:03.000 "is_configured": true, 00:19:03.000 "data_offset": 2048, 00:19:03.000 "data_size": 63488 00:19:03.000 }, 00:19:03.000 { 00:19:03.000 "name": "BaseBdev2", 00:19:03.000 "uuid": "f3f44069-8523-50db-a5c5-5fa8ec095bc6", 00:19:03.000 "is_configured": true, 00:19:03.000 "data_offset": 2048, 00:19:03.000 "data_size": 63488 00:19:03.000 }, 00:19:03.000 { 00:19:03.000 "name": "BaseBdev3", 00:19:03.000 "uuid": "01fff448-7735-5a38-8835-469e4db6607b", 00:19:03.000 "is_configured": true, 00:19:03.000 "data_offset": 2048, 00:19:03.000 "data_size": 63488 00:19:03.000 }, 00:19:03.000 { 00:19:03.000 "name": "BaseBdev4", 00:19:03.000 "uuid": "e9a42e71-1e77-5e1d-870b-80dcd31fd3e4", 00:19:03.000 "is_configured": true, 00:19:03.000 "data_offset": 2048, 00:19:03.000 "data_size": 63488 00:19:03.000 } 00:19:03.000 ] 00:19:03.000 }' 00:19:03.000 13:44:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.000 13:44:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:03.936 13:44:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:03.936 13:44:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:03.936 [2024-07-12 13:44:52.465559] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe69900 00:19:04.871 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:05.130 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:05.130 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:05.130 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.131 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:05.390 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.390 "name": "raid_bdev1", 00:19:05.390 "uuid": "9cc71cf1-d9fc-40c6-9fa7-e5bf5f17cffd", 00:19:05.390 "strip_size_kb": 64, 00:19:05.390 "state": "online", 00:19:05.390 "raid_level": "raid0", 00:19:05.390 "superblock": true, 00:19:05.390 "num_base_bdevs": 4, 00:19:05.390 "num_base_bdevs_discovered": 4, 00:19:05.390 "num_base_bdevs_operational": 4, 00:19:05.390 "base_bdevs_list": [ 00:19:05.390 { 00:19:05.390 "name": "BaseBdev1", 00:19:05.390 "uuid": "2748f01f-7e6b-5df7-b085-2d957a39be08", 00:19:05.390 "is_configured": true, 00:19:05.390 "data_offset": 2048, 00:19:05.390 "data_size": 63488 00:19:05.390 }, 00:19:05.390 { 00:19:05.390 "name": "BaseBdev2", 00:19:05.390 "uuid": "f3f44069-8523-50db-a5c5-5fa8ec095bc6", 00:19:05.390 "is_configured": true, 00:19:05.390 "data_offset": 2048, 00:19:05.390 "data_size": 63488 00:19:05.390 }, 00:19:05.390 { 00:19:05.390 "name": "BaseBdev3", 00:19:05.390 "uuid": "01fff448-7735-5a38-8835-469e4db6607b", 00:19:05.390 "is_configured": true, 00:19:05.390 "data_offset": 2048, 00:19:05.390 "data_size": 63488 00:19:05.390 }, 00:19:05.390 { 00:19:05.390 "name": "BaseBdev4", 00:19:05.390 "uuid": "e9a42e71-1e77-5e1d-870b-80dcd31fd3e4", 00:19:05.390 "is_configured": true, 00:19:05.390 "data_offset": 2048, 00:19:05.390 "data_size": 63488 00:19:05.390 } 00:19:05.390 ] 00:19:05.390 }' 00:19:05.390 13:44:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.390 13:44:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.326 13:44:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:06.586 [2024-07-12 13:44:54.963017] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:06.586 [2024-07-12 13:44:54.963059] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:06.586 [2024-07-12 13:44:54.966219] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:06.586 [2024-07-12 13:44:54.966257] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:06.586 [2024-07-12 13:44:54.966297] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:06.586 [2024-07-12 13:44:54.966308] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0xe77560 name raid_bdev1, state offline 00:19:06.586 0 00:19:06.586 13:44:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 502680 00:19:06.586 13:44:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 502680 ']' 00:19:06.586 13:44:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 502680 00:19:06.586 13:44:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:19:06.586 13:44:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:06.586 13:44:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 502680 00:19:06.586 13:44:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:06.586 13:44:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:06.586 13:44:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 502680' 00:19:06.586 killing process with pid 502680 00:19:06.586 13:44:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 502680 00:19:06.586 [2024-07-12 13:44:55.046293] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:06.586 13:44:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 502680 00:19:06.586 [2024-07-12 13:44:55.078255] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:06.845 13:44:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.7S9ElNUFkG 00:19:06.845 13:44:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:06.846 13:44:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:06.846 13:44:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40 00:19:06.846 13:44:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:06.846 13:44:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:06.846 13:44:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:06.846 13:44:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]] 00:19:06.846 00:19:06.846 real 0m8.658s 00:19:06.846 user 0m14.164s 00:19:06.846 sys 0m1.429s 00:19:06.846 13:44:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:06.846 13:44:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.846 ************************************ 00:19:06.846 END TEST raid_read_error_test 00:19:06.846 ************************************ 00:19:06.846 13:44:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:06.846 13:44:55 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:19:06.846 13:44:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:06.846 13:44:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:06.846 13:44:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:06.846 ************************************ 00:19:06.846 START TEST raid_write_error_test 00:19:06.846 ************************************ 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:19:06.846 
13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.vLWipBYZR2 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=503843 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 503843 /var/tmp/spdk-raid.sock 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:06.846 13:44:55 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 503843 ']' 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:06.846 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:06.846 13:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:07.105 [2024-07-12 13:44:55.520975] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:19:07.105 [2024-07-12 13:44:55.521108] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid503843 ] 00:19:07.365 [2024-07-12 13:44:55.715576] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:07.365 [2024-07-12 13:44:55.818658] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:07.365 [2024-07-12 13:44:55.878409] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:07.365 [2024-07-12 13:44:55.878436] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:07.365 13:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:07.365 13:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:07.365 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:07.365 13:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:07.933 BaseBdev1_malloc 00:19:07.933 13:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:08.193 true 00:19:08.193 13:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:08.761 [2024-07-12 13:44:57.238147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:08.761 [2024-07-12 13:44:57.238192] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:08.761 [2024-07-12 13:44:57.238213] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cc5a10 00:19:08.761 [2024-07-12 13:44:57.238226] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:08.761 [2024-07-12 13:44:57.240097] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:08.762 [2024-07-12 13:44:57.240127] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:08.762 BaseBdev1 00:19:08.762 13:44:57 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:08.762 13:44:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:09.021 BaseBdev2_malloc 00:19:09.021 13:44:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:09.589 true 00:19:09.589 13:44:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:09.847 [2024-07-12 13:44:58.349685] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:09.848 [2024-07-12 13:44:58.349727] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:09.848 [2024-07-12 13:44:58.349748] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cca250 00:19:09.848 [2024-07-12 13:44:58.349761] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:09.848 [2024-07-12 13:44:58.351379] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:09.848 [2024-07-12 13:44:58.351407] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:09.848 BaseBdev2 00:19:09.848 13:44:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:09.848 13:44:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:10.414 BaseBdev3_malloc 00:19:10.414 13:44:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:10.673 true 00:19:10.673 13:44:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:11.240 [2024-07-12 13:44:59.658870] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:11.240 [2024-07-12 13:44:59.658916] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:11.240 [2024-07-12 13:44:59.658944] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ccc510 00:19:11.240 [2024-07-12 13:44:59.658957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:11.240 [2024-07-12 13:44:59.660537] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:11.240 [2024-07-12 13:44:59.660565] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:11.240 BaseBdev3 00:19:11.240 13:44:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:11.240 13:44:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:11.499 BaseBdev4_malloc 00:19:11.499 13:44:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:11.758 true 00:19:11.758 13:45:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:12.016 [2024-07-12 13:45:00.518965] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:12.016 [2024-07-12 13:45:00.519012] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:12.016 [2024-07-12 13:45:00.519033] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ccd3e0 00:19:12.016 [2024-07-12 13:45:00.519046] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:12.016 [2024-07-12 13:45:00.520670] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:12.016 [2024-07-12 13:45:00.520699] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:12.016 BaseBdev4 00:19:12.016 13:45:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:12.581 [2024-07-12 13:45:01.020298] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:12.581 [2024-07-12 13:45:01.021643] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:12.581 [2024-07-12 13:45:01.021711] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:12.581 [2024-07-12 13:45:01.021772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:12.581 [2024-07-12 13:45:01.022007] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cc7560 00:19:12.581 [2024-07-12 13:45:01.022019] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:12.581 [2024-07-12 13:45:01.022214] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b1bba0 00:19:12.581 [2024-07-12 13:45:01.022367] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cc7560 00:19:12.581 [2024-07-12 13:45:01.022378] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cc7560 00:19:12.581 [2024-07-12 13:45:01.022483] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.581 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:13.149 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:13.149 "name": "raid_bdev1", 00:19:13.149 "uuid": "6c685175-0276-4cff-98a3-a94ceacd72fe", 00:19:13.149 "strip_size_kb": 64, 00:19:13.149 "state": "online", 00:19:13.149 "raid_level": "raid0", 00:19:13.149 "superblock": true, 00:19:13.149 "num_base_bdevs": 4, 00:19:13.149 "num_base_bdevs_discovered": 4, 00:19:13.149 "num_base_bdevs_operational": 4, 00:19:13.149 "base_bdevs_list": [ 00:19:13.149 { 00:19:13.149 "name": "BaseBdev1", 00:19:13.149 "uuid": "cf0a9b97-e7a3-5cd9-ab30-6ba193242687", 00:19:13.149 "is_configured": true, 00:19:13.149 "data_offset": 2048, 00:19:13.149 "data_size": 63488 00:19:13.149 }, 00:19:13.149 { 00:19:13.149 "name": "BaseBdev2", 00:19:13.149 "uuid": "8b903c0b-a558-5f4e-985d-358f06330603", 00:19:13.149 "is_configured": true, 00:19:13.149 "data_offset": 2048, 00:19:13.149 "data_size": 63488 00:19:13.149 }, 00:19:13.149 { 00:19:13.149 "name": "BaseBdev3", 00:19:13.149 "uuid": "cfbe83c8-a575-5fea-b8c3-8d57c46e0363", 00:19:13.149 "is_configured": true, 00:19:13.149 "data_offset": 2048, 00:19:13.149 "data_size": 63488 00:19:13.149 }, 00:19:13.149 { 00:19:13.149 "name": "BaseBdev4", 00:19:13.149 "uuid": "31962740-0f31-5502-bec4-8754b161092c", 00:19:13.149 "is_configured": true, 00:19:13.149 "data_offset": 2048, 00:19:13.149 "data_size": 63488 00:19:13.149 } 00:19:13.149 ] 00:19:13.149 }' 00:19:13.149 13:45:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:13.149 13:45:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:14.088 13:45:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:14.089 13:45:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:14.089 [2024-07-12 13:45:02.548643] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cb9900 00:19:15.041 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:15.301 13:45:03 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.301 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.561 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.561 "name": "raid_bdev1", 00:19:15.561 "uuid": "6c685175-0276-4cff-98a3-a94ceacd72fe", 00:19:15.561 "strip_size_kb": 64, 00:19:15.561 "state": "online", 00:19:15.561 "raid_level": "raid0", 00:19:15.561 "superblock": true, 00:19:15.561 "num_base_bdevs": 4, 00:19:15.561 "num_base_bdevs_discovered": 4, 00:19:15.561 "num_base_bdevs_operational": 4, 00:19:15.561 "base_bdevs_list": [ 00:19:15.561 { 00:19:15.561 "name": "BaseBdev1", 00:19:15.561 "uuid": "cf0a9b97-e7a3-5cd9-ab30-6ba193242687", 00:19:15.561 "is_configured": true, 00:19:15.561 "data_offset": 2048, 00:19:15.561 "data_size": 63488 00:19:15.561 }, 00:19:15.561 { 00:19:15.561 "name": "BaseBdev2", 00:19:15.561 "uuid": "8b903c0b-a558-5f4e-985d-358f06330603", 00:19:15.561 "is_configured": true, 00:19:15.561 "data_offset": 2048, 00:19:15.561 "data_size": 63488 00:19:15.561 }, 00:19:15.561 { 00:19:15.561 "name": "BaseBdev3", 00:19:15.561 "uuid": "cfbe83c8-a575-5fea-b8c3-8d57c46e0363", 00:19:15.561 "is_configured": true, 00:19:15.561 "data_offset": 2048, 00:19:15.561 "data_size": 63488 00:19:15.561 }, 00:19:15.561 { 00:19:15.561 "name": "BaseBdev4", 00:19:15.561 "uuid": "31962740-0f31-5502-bec4-8754b161092c", 00:19:15.561 "is_configured": true, 00:19:15.561 "data_offset": 2048, 00:19:15.561 "data_size": 63488 00:19:15.561 } 00:19:15.561 ] 00:19:15.561 }' 00:19:15.561 13:45:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.561 13:45:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:16.498 13:45:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:16.498 [2024-07-12 13:45:05.053880] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:16.498 [2024-07-12 13:45:05.053919] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:16.498 [2024-07-12 13:45:05.057084] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:16.498 [2024-07-12 13:45:05.057121] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:16.498 [2024-07-12 13:45:05.057161] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:16.498 [2024-07-12 
13:45:05.057172] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cc7560 name raid_bdev1, state offline 00:19:16.498 0 00:19:16.757 13:45:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 503843 00:19:16.758 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 503843 ']' 00:19:16.758 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 503843 00:19:16.758 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:19:16.758 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:16.758 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 503843 00:19:16.758 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:16.758 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:16.758 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 503843' 00:19:16.758 killing process with pid 503843 00:19:16.758 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 503843 00:19:16.758 [2024-07-12 13:45:05.140357] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:16.758 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 503843 00:19:16.758 [2024-07-12 13:45:05.171505] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:17.018 13:45:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.vLWipBYZR2 00:19:17.018 13:45:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:17.018 13:45:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:17.018 13:45:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40 00:19:17.018 13:45:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:17.018 13:45:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:17.018 13:45:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:17.018 13:45:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]] 00:19:17.018 00:19:17.018 real 0m10.009s 00:19:17.018 user 0m17.068s 00:19:17.018 sys 0m1.695s 00:19:17.018 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:17.018 13:45:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:17.018 ************************************ 00:19:17.018 END TEST raid_write_error_test 00:19:17.018 ************************************ 00:19:17.018 13:45:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:17.018 13:45:05 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:19:17.018 13:45:05 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:19:17.018 13:45:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:17.018 13:45:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:17.018 13:45:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:17.018 ************************************ 00:19:17.018 START TEST raid_state_function_test 
00:19:17.018 ************************************ 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=505650 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 
505650' 00:19:17.018 Process raid pid: 505650 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 505650 /var/tmp/spdk-raid.sock 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 505650 ']' 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:17.018 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:17.018 13:45:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:17.018 [2024-07-12 13:45:05.567220] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:19:17.018 [2024-07-12 13:45:05.567293] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:17.278 [2024-07-12 13:45:05.699604] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:17.278 [2024-07-12 13:45:05.800698] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:17.538 [2024-07-12 13:45:05.863406] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:17.538 [2024-07-12 13:45:05.863435] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:18.105 13:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:18.105 13:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:19:18.105 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:18.365 [2024-07-12 13:45:06.726259] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:18.365 [2024-07-12 13:45:06.726303] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:18.365 [2024-07-12 13:45:06.726314] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:18.365 [2024-07-12 13:45:06.726326] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:18.365 [2024-07-12 13:45:06.726335] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:18.365 [2024-07-12 13:45:06.726346] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:18.365 [2024-07-12 13:45:06.726355] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:18.365 [2024-07-12 13:45:06.726366] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.365 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.624 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.624 "name": "Existed_Raid", 00:19:18.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.624 "strip_size_kb": 64, 00:19:18.624 "state": "configuring", 00:19:18.624 "raid_level": "concat", 00:19:18.624 "superblock": false, 00:19:18.624 "num_base_bdevs": 4, 00:19:18.624 "num_base_bdevs_discovered": 0, 00:19:18.624 "num_base_bdevs_operational": 4, 00:19:18.624 "base_bdevs_list": [ 00:19:18.624 { 00:19:18.624 "name": "BaseBdev1", 00:19:18.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.624 "is_configured": false, 00:19:18.624 "data_offset": 0, 00:19:18.624 "data_size": 0 00:19:18.624 }, 00:19:18.624 { 00:19:18.624 "name": "BaseBdev2", 00:19:18.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.624 "is_configured": false, 00:19:18.624 "data_offset": 0, 00:19:18.624 "data_size": 0 00:19:18.624 }, 00:19:18.624 { 00:19:18.624 "name": "BaseBdev3", 00:19:18.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.624 "is_configured": false, 00:19:18.624 "data_offset": 0, 00:19:18.624 "data_size": 0 00:19:18.624 }, 00:19:18.624 { 00:19:18.624 "name": "BaseBdev4", 00:19:18.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.624 "is_configured": false, 00:19:18.624 "data_offset": 0, 00:19:18.624 "data_size": 0 00:19:18.624 } 00:19:18.624 ] 00:19:18.624 }' 00:19:18.624 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.624 13:45:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:19.192 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:19.450 [2024-07-12 13:45:07.845068] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:19.450 
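What the state-function test is checking here: a concat array created over base bdevs that do not exist yet stays in the configuring state with zero discovered members, and can be deleted again from that state. A minimal sketch of that check, reusing the names and RPC socket from the trace (the per-field jq extraction is illustrative; verify_raid_bdev_state in bdev_raid.sh may compare the fields differently).

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
[ "$(echo "$info" | jq -r .state)" = configuring ]              # no base bdevs have shown up yet
[ "$(echo "$info" | jq -r .num_base_bdevs_discovered)" -eq 0 ]
$rpc bdev_raid_delete Existed_Raid                              # deletable while still configuring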
[2024-07-12 13:45:07.845099] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19eb370 name Existed_Raid, state configuring 00:19:19.450 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:19.709 [2024-07-12 13:45:08.089743] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:19.709 [2024-07-12 13:45:08.089774] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:19.709 [2024-07-12 13:45:08.089785] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:19.709 [2024-07-12 13:45:08.089797] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:19.709 [2024-07-12 13:45:08.089805] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:19.709 [2024-07-12 13:45:08.089818] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:19.709 [2024-07-12 13:45:08.089827] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:19.709 [2024-07-12 13:45:08.089840] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:19.709 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:20.278 [2024-07-12 13:45:08.608876] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:20.278 BaseBdev1 00:19:20.278 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:20.278 13:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:20.278 13:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:20.278 13:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:20.278 13:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:20.278 13:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:20.278 13:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:20.538 13:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:20.538 [ 00:19:20.538 { 00:19:20.538 "name": "BaseBdev1", 00:19:20.538 "aliases": [ 00:19:20.538 "c033485f-d4ea-43ae-b74a-ca5a0ec35f3e" 00:19:20.538 ], 00:19:20.538 "product_name": "Malloc disk", 00:19:20.538 "block_size": 512, 00:19:20.538 "num_blocks": 65536, 00:19:20.538 "uuid": "c033485f-d4ea-43ae-b74a-ca5a0ec35f3e", 00:19:20.538 "assigned_rate_limits": { 00:19:20.538 "rw_ios_per_sec": 0, 00:19:20.538 "rw_mbytes_per_sec": 0, 00:19:20.538 "r_mbytes_per_sec": 0, 00:19:20.538 "w_mbytes_per_sec": 0 00:19:20.538 }, 00:19:20.538 "claimed": true, 00:19:20.538 "claim_type": "exclusive_write", 00:19:20.538 "zoned": false, 
00:19:20.538 "supported_io_types": { 00:19:20.538 "read": true, 00:19:20.538 "write": true, 00:19:20.538 "unmap": true, 00:19:20.538 "flush": true, 00:19:20.538 "reset": true, 00:19:20.538 "nvme_admin": false, 00:19:20.538 "nvme_io": false, 00:19:20.538 "nvme_io_md": false, 00:19:20.538 "write_zeroes": true, 00:19:20.538 "zcopy": true, 00:19:20.538 "get_zone_info": false, 00:19:20.538 "zone_management": false, 00:19:20.538 "zone_append": false, 00:19:20.538 "compare": false, 00:19:20.538 "compare_and_write": false, 00:19:20.538 "abort": true, 00:19:20.538 "seek_hole": false, 00:19:20.538 "seek_data": false, 00:19:20.538 "copy": true, 00:19:20.538 "nvme_iov_md": false 00:19:20.538 }, 00:19:20.538 "memory_domains": [ 00:19:20.538 { 00:19:20.538 "dma_device_id": "system", 00:19:20.538 "dma_device_type": 1 00:19:20.538 }, 00:19:20.538 { 00:19:20.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.538 "dma_device_type": 2 00:19:20.538 } 00:19:20.538 ], 00:19:20.538 "driver_specific": {} 00:19:20.538 } 00:19:20.538 ] 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.797 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.798 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:21.057 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:21.057 "name": "Existed_Raid", 00:19:21.057 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:21.057 "strip_size_kb": 64, 00:19:21.057 "state": "configuring", 00:19:21.057 "raid_level": "concat", 00:19:21.057 "superblock": false, 00:19:21.057 "num_base_bdevs": 4, 00:19:21.057 "num_base_bdevs_discovered": 1, 00:19:21.057 "num_base_bdevs_operational": 4, 00:19:21.057 "base_bdevs_list": [ 00:19:21.057 { 00:19:21.057 "name": "BaseBdev1", 00:19:21.057 "uuid": "c033485f-d4ea-43ae-b74a-ca5a0ec35f3e", 00:19:21.057 "is_configured": true, 00:19:21.057 "data_offset": 0, 00:19:21.057 "data_size": 65536 00:19:21.057 }, 00:19:21.057 { 00:19:21.057 "name": "BaseBdev2", 00:19:21.057 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:21.057 "is_configured": false, 00:19:21.057 "data_offset": 0, 00:19:21.057 
"data_size": 0 00:19:21.057 }, 00:19:21.057 { 00:19:21.057 "name": "BaseBdev3", 00:19:21.057 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:21.057 "is_configured": false, 00:19:21.057 "data_offset": 0, 00:19:21.057 "data_size": 0 00:19:21.057 }, 00:19:21.057 { 00:19:21.057 "name": "BaseBdev4", 00:19:21.057 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:21.057 "is_configured": false, 00:19:21.057 "data_offset": 0, 00:19:21.057 "data_size": 0 00:19:21.057 } 00:19:21.057 ] 00:19:21.057 }' 00:19:21.057 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:21.057 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:21.624 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:21.886 [2024-07-12 13:45:10.229194] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:21.886 [2024-07-12 13:45:10.229238] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19eabe0 name Existed_Raid, state configuring 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:21.886 [2024-07-12 13:45:10.405699] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:21.886 [2024-07-12 13:45:10.407121] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:21.886 [2024-07-12 13:45:10.407154] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:21.886 [2024-07-12 13:45:10.407164] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:21.886 [2024-07-12 13:45:10.407176] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:21.886 [2024-07-12 13:45:10.407185] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:21.886 [2024-07-12 13:45:10.407196] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:21.886 13:45:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:21.886 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.887 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:22.145 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.145 "name": "Existed_Raid", 00:19:22.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.145 "strip_size_kb": 64, 00:19:22.145 "state": "configuring", 00:19:22.145 "raid_level": "concat", 00:19:22.145 "superblock": false, 00:19:22.145 "num_base_bdevs": 4, 00:19:22.145 "num_base_bdevs_discovered": 1, 00:19:22.145 "num_base_bdevs_operational": 4, 00:19:22.145 "base_bdevs_list": [ 00:19:22.145 { 00:19:22.145 "name": "BaseBdev1", 00:19:22.145 "uuid": "c033485f-d4ea-43ae-b74a-ca5a0ec35f3e", 00:19:22.145 "is_configured": true, 00:19:22.145 "data_offset": 0, 00:19:22.145 "data_size": 65536 00:19:22.145 }, 00:19:22.145 { 00:19:22.145 "name": "BaseBdev2", 00:19:22.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.145 "is_configured": false, 00:19:22.145 "data_offset": 0, 00:19:22.145 "data_size": 0 00:19:22.145 }, 00:19:22.145 { 00:19:22.145 "name": "BaseBdev3", 00:19:22.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.145 "is_configured": false, 00:19:22.145 "data_offset": 0, 00:19:22.145 "data_size": 0 00:19:22.145 }, 00:19:22.145 { 00:19:22.145 "name": "BaseBdev4", 00:19:22.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.145 "is_configured": false, 00:19:22.145 "data_offset": 0, 00:19:22.145 "data_size": 0 00:19:22.145 } 00:19:22.145 ] 00:19:22.145 }' 00:19:22.145 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.145 13:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:22.713 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:22.972 [2024-07-12 13:45:11.395869] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:22.972 BaseBdev2 00:19:22.972 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:22.972 13:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:22.972 13:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:22.972 13:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:22.972 13:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:22.972 13:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:22.972 13:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:23.231 [ 00:19:23.231 { 00:19:23.231 "name": "BaseBdev2", 00:19:23.231 "aliases": [ 00:19:23.231 "9b8d7d35-9527-439f-95f8-578ba7b730a3" 00:19:23.231 ], 00:19:23.231 "product_name": "Malloc disk", 00:19:23.231 "block_size": 512, 00:19:23.231 "num_blocks": 65536, 00:19:23.231 "uuid": "9b8d7d35-9527-439f-95f8-578ba7b730a3", 00:19:23.231 "assigned_rate_limits": { 00:19:23.231 "rw_ios_per_sec": 0, 00:19:23.231 "rw_mbytes_per_sec": 0, 00:19:23.231 "r_mbytes_per_sec": 0, 00:19:23.231 "w_mbytes_per_sec": 0 00:19:23.231 }, 00:19:23.231 "claimed": true, 00:19:23.231 "claim_type": "exclusive_write", 00:19:23.231 "zoned": false, 00:19:23.231 "supported_io_types": { 00:19:23.231 "read": true, 00:19:23.231 "write": true, 00:19:23.231 "unmap": true, 00:19:23.231 "flush": true, 00:19:23.231 "reset": true, 00:19:23.231 "nvme_admin": false, 00:19:23.231 "nvme_io": false, 00:19:23.231 "nvme_io_md": false, 00:19:23.231 "write_zeroes": true, 00:19:23.231 "zcopy": true, 00:19:23.231 "get_zone_info": false, 00:19:23.231 "zone_management": false, 00:19:23.231 "zone_append": false, 00:19:23.231 "compare": false, 00:19:23.231 "compare_and_write": false, 00:19:23.231 "abort": true, 00:19:23.231 "seek_hole": false, 00:19:23.231 "seek_data": false, 00:19:23.231 "copy": true, 00:19:23.231 "nvme_iov_md": false 00:19:23.231 }, 00:19:23.231 "memory_domains": [ 00:19:23.231 { 00:19:23.231 "dma_device_id": "system", 00:19:23.231 "dma_device_type": 1 00:19:23.231 }, 00:19:23.231 { 00:19:23.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.231 "dma_device_type": 2 00:19:23.231 } 00:19:23.231 ], 00:19:23.231 "driver_specific": {} 00:19:23.231 } 00:19:23.231 ] 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.231 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:23.490 13:45:11 
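The (( i++ )) / (( i < num_base_bdevs )) counters above are one pass of a loop that creates the next 32 MiB, 512-byte-block malloc member, waits for it to be examined, and then expects num_base_bdevs_discovered to have grown by one while the array stays configuring until the last member arrives. A compact, simplified sketch of that loop (a reconstruction of the traced pattern, not the bdev_raid.sh source).

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
for i in 1 2 3 4; do
    $rpc bdev_malloc_create 32 512 -b BaseBdev$i       # 32 MiB, 512-byte block size
    $rpc bdev_wait_for_examine
    $rpc bdev_get_bdevs -b BaseBdev$i -t 2000 > /dev/null   # -t waits up to 2000 ms for the bdev
    found=$($rpc bdev_raid_get_bdevs all |
            jq -r '.[] | select(.name == "Existed_Raid").num_base_bdevs_discovered')
    [ "$found" -eq "$i" ] || echo "expected $i discovered base bdevs, got $found"
done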
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:23.490 "name": "Existed_Raid", 00:19:23.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:23.490 "strip_size_kb": 64, 00:19:23.490 "state": "configuring", 00:19:23.490 "raid_level": "concat", 00:19:23.490 "superblock": false, 00:19:23.490 "num_base_bdevs": 4, 00:19:23.490 "num_base_bdevs_discovered": 2, 00:19:23.490 "num_base_bdevs_operational": 4, 00:19:23.490 "base_bdevs_list": [ 00:19:23.490 { 00:19:23.490 "name": "BaseBdev1", 00:19:23.490 "uuid": "c033485f-d4ea-43ae-b74a-ca5a0ec35f3e", 00:19:23.490 "is_configured": true, 00:19:23.490 "data_offset": 0, 00:19:23.490 "data_size": 65536 00:19:23.490 }, 00:19:23.490 { 00:19:23.490 "name": "BaseBdev2", 00:19:23.490 "uuid": "9b8d7d35-9527-439f-95f8-578ba7b730a3", 00:19:23.490 "is_configured": true, 00:19:23.490 "data_offset": 0, 00:19:23.490 "data_size": 65536 00:19:23.490 }, 00:19:23.490 { 00:19:23.490 "name": "BaseBdev3", 00:19:23.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:23.490 "is_configured": false, 00:19:23.490 "data_offset": 0, 00:19:23.490 "data_size": 0 00:19:23.490 }, 00:19:23.490 { 00:19:23.490 "name": "BaseBdev4", 00:19:23.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:23.490 "is_configured": false, 00:19:23.490 "data_offset": 0, 00:19:23.490 "data_size": 0 00:19:23.490 } 00:19:23.490 ] 00:19:23.490 }' 00:19:23.490 13:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:23.490 13:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:24.058 13:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:24.318 [2024-07-12 13:45:12.791487] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:24.318 BaseBdev3 00:19:24.318 13:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:24.318 13:45:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:24.318 13:45:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:24.318 13:45:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:24.318 13:45:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:24.318 13:45:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:24.318 13:45:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:24.577 13:45:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:24.836 [ 00:19:24.836 { 00:19:24.836 "name": "BaseBdev3", 00:19:24.836 "aliases": [ 00:19:24.836 "b31a4d58-5781-4b6e-ac52-7142b4067fa5" 00:19:24.836 ], 00:19:24.836 "product_name": "Malloc disk", 00:19:24.836 "block_size": 512, 00:19:24.836 "num_blocks": 65536, 00:19:24.836 "uuid": "b31a4d58-5781-4b6e-ac52-7142b4067fa5", 00:19:24.836 "assigned_rate_limits": { 00:19:24.836 "rw_ios_per_sec": 0, 00:19:24.836 "rw_mbytes_per_sec": 0, 00:19:24.836 "r_mbytes_per_sec": 0, 
00:19:24.836 "w_mbytes_per_sec": 0 00:19:24.836 }, 00:19:24.836 "claimed": true, 00:19:24.836 "claim_type": "exclusive_write", 00:19:24.836 "zoned": false, 00:19:24.836 "supported_io_types": { 00:19:24.836 "read": true, 00:19:24.836 "write": true, 00:19:24.836 "unmap": true, 00:19:24.836 "flush": true, 00:19:24.836 "reset": true, 00:19:24.836 "nvme_admin": false, 00:19:24.836 "nvme_io": false, 00:19:24.836 "nvme_io_md": false, 00:19:24.836 "write_zeroes": true, 00:19:24.836 "zcopy": true, 00:19:24.836 "get_zone_info": false, 00:19:24.836 "zone_management": false, 00:19:24.836 "zone_append": false, 00:19:24.836 "compare": false, 00:19:24.836 "compare_and_write": false, 00:19:24.836 "abort": true, 00:19:24.836 "seek_hole": false, 00:19:24.836 "seek_data": false, 00:19:24.836 "copy": true, 00:19:24.836 "nvme_iov_md": false 00:19:24.836 }, 00:19:24.836 "memory_domains": [ 00:19:24.836 { 00:19:24.836 "dma_device_id": "system", 00:19:24.836 "dma_device_type": 1 00:19:24.836 }, 00:19:24.836 { 00:19:24.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.836 "dma_device_type": 2 00:19:24.836 } 00:19:24.836 ], 00:19:24.836 "driver_specific": {} 00:19:24.836 } 00:19:24.836 ] 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.836 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:25.095 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.095 "name": "Existed_Raid", 00:19:25.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.095 "strip_size_kb": 64, 00:19:25.095 "state": "configuring", 00:19:25.095 "raid_level": "concat", 00:19:25.095 "superblock": false, 00:19:25.095 "num_base_bdevs": 4, 00:19:25.095 "num_base_bdevs_discovered": 3, 00:19:25.095 "num_base_bdevs_operational": 4, 00:19:25.095 "base_bdevs_list": [ 00:19:25.095 { 00:19:25.095 "name": "BaseBdev1", 
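The per-bdev dumps like the one above come from bdev_get_bdevs -b <name>; the block_size/md_size/md_interleave/dif_type checks further down in the log run jq over the same JSON. A short sketch of that kind of check against one malloc member, with expected values taken from the dumps (65536 blocks of 512 bytes, i.e. the 32 MiB created earlier); the exact comparisons in the script may differ.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
info=$($rpc bdev_get_bdevs -b BaseBdev3 | jq '.[]')
[ "$(echo "$info" | jq .block_size)" -eq 512 ]
[ "$(echo "$info" | jq .num_blocks)" -eq 65536 ]
[ "$(echo "$info" | jq .md_size)" = null ]       # malloc bdevs report no metadata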
00:19:25.095 "uuid": "c033485f-d4ea-43ae-b74a-ca5a0ec35f3e", 00:19:25.095 "is_configured": true, 00:19:25.095 "data_offset": 0, 00:19:25.095 "data_size": 65536 00:19:25.095 }, 00:19:25.095 { 00:19:25.095 "name": "BaseBdev2", 00:19:25.095 "uuid": "9b8d7d35-9527-439f-95f8-578ba7b730a3", 00:19:25.095 "is_configured": true, 00:19:25.095 "data_offset": 0, 00:19:25.095 "data_size": 65536 00:19:25.095 }, 00:19:25.095 { 00:19:25.095 "name": "BaseBdev3", 00:19:25.095 "uuid": "b31a4d58-5781-4b6e-ac52-7142b4067fa5", 00:19:25.095 "is_configured": true, 00:19:25.095 "data_offset": 0, 00:19:25.095 "data_size": 65536 00:19:25.095 }, 00:19:25.095 { 00:19:25.095 "name": "BaseBdev4", 00:19:25.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.095 "is_configured": false, 00:19:25.095 "data_offset": 0, 00:19:25.095 "data_size": 0 00:19:25.095 } 00:19:25.095 ] 00:19:25.095 }' 00:19:25.095 13:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.095 13:45:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:25.662 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:25.920 [2024-07-12 13:45:14.338974] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:25.920 [2024-07-12 13:45:14.339015] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19ebc40 00:19:25.920 [2024-07-12 13:45:14.339024] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:25.920 [2024-07-12 13:45:14.339241] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19ec8c0 00:19:25.920 [2024-07-12 13:45:14.339367] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19ebc40 00:19:25.920 [2024-07-12 13:45:14.339377] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19ebc40 00:19:25.920 [2024-07-12 13:45:14.339544] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:25.920 BaseBdev4 00:19:25.920 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:25.920 13:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:25.920 13:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:25.920 13:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:25.920 13:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:25.920 13:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:25.920 13:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:26.179 13:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:26.438 [ 00:19:26.438 { 00:19:26.438 "name": "BaseBdev4", 00:19:26.438 "aliases": [ 00:19:26.438 "128991d3-e98d-4bed-b6a7-231eac35bfba" 00:19:26.438 ], 00:19:26.438 "product_name": "Malloc disk", 00:19:26.438 "block_size": 512, 00:19:26.438 
"num_blocks": 65536, 00:19:26.438 "uuid": "128991d3-e98d-4bed-b6a7-231eac35bfba", 00:19:26.438 "assigned_rate_limits": { 00:19:26.438 "rw_ios_per_sec": 0, 00:19:26.438 "rw_mbytes_per_sec": 0, 00:19:26.438 "r_mbytes_per_sec": 0, 00:19:26.438 "w_mbytes_per_sec": 0 00:19:26.438 }, 00:19:26.438 "claimed": true, 00:19:26.438 "claim_type": "exclusive_write", 00:19:26.438 "zoned": false, 00:19:26.438 "supported_io_types": { 00:19:26.438 "read": true, 00:19:26.438 "write": true, 00:19:26.438 "unmap": true, 00:19:26.438 "flush": true, 00:19:26.438 "reset": true, 00:19:26.438 "nvme_admin": false, 00:19:26.438 "nvme_io": false, 00:19:26.438 "nvme_io_md": false, 00:19:26.438 "write_zeroes": true, 00:19:26.438 "zcopy": true, 00:19:26.438 "get_zone_info": false, 00:19:26.438 "zone_management": false, 00:19:26.438 "zone_append": false, 00:19:26.438 "compare": false, 00:19:26.438 "compare_and_write": false, 00:19:26.438 "abort": true, 00:19:26.438 "seek_hole": false, 00:19:26.438 "seek_data": false, 00:19:26.438 "copy": true, 00:19:26.438 "nvme_iov_md": false 00:19:26.438 }, 00:19:26.438 "memory_domains": [ 00:19:26.438 { 00:19:26.438 "dma_device_id": "system", 00:19:26.438 "dma_device_type": 1 00:19:26.438 }, 00:19:26.438 { 00:19:26.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:26.438 "dma_device_type": 2 00:19:26.438 } 00:19:26.438 ], 00:19:26.438 "driver_specific": {} 00:19:26.438 } 00:19:26.438 ] 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.438 13:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:26.697 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:26.697 "name": "Existed_Raid", 00:19:26.697 "uuid": "d5d43ec9-1362-4f40-91f7-35c86e1a3ac0", 00:19:26.697 "strip_size_kb": 64, 00:19:26.697 "state": "online", 00:19:26.697 "raid_level": "concat", 00:19:26.697 "superblock": false, 
00:19:26.697 "num_base_bdevs": 4, 00:19:26.697 "num_base_bdevs_discovered": 4, 00:19:26.697 "num_base_bdevs_operational": 4, 00:19:26.697 "base_bdevs_list": [ 00:19:26.697 { 00:19:26.697 "name": "BaseBdev1", 00:19:26.697 "uuid": "c033485f-d4ea-43ae-b74a-ca5a0ec35f3e", 00:19:26.697 "is_configured": true, 00:19:26.697 "data_offset": 0, 00:19:26.697 "data_size": 65536 00:19:26.697 }, 00:19:26.697 { 00:19:26.697 "name": "BaseBdev2", 00:19:26.697 "uuid": "9b8d7d35-9527-439f-95f8-578ba7b730a3", 00:19:26.697 "is_configured": true, 00:19:26.697 "data_offset": 0, 00:19:26.697 "data_size": 65536 00:19:26.697 }, 00:19:26.697 { 00:19:26.697 "name": "BaseBdev3", 00:19:26.697 "uuid": "b31a4d58-5781-4b6e-ac52-7142b4067fa5", 00:19:26.697 "is_configured": true, 00:19:26.697 "data_offset": 0, 00:19:26.697 "data_size": 65536 00:19:26.697 }, 00:19:26.697 { 00:19:26.697 "name": "BaseBdev4", 00:19:26.697 "uuid": "128991d3-e98d-4bed-b6a7-231eac35bfba", 00:19:26.697 "is_configured": true, 00:19:26.697 "data_offset": 0, 00:19:26.697 "data_size": 65536 00:19:26.697 } 00:19:26.697 ] 00:19:26.697 }' 00:19:26.697 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:26.697 13:45:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:27.264 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:27.264 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:27.264 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:27.264 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:27.264 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:27.264 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:27.264 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:27.264 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:27.527 [2024-07-12 13:45:15.927529] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:27.527 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:27.527 "name": "Existed_Raid", 00:19:27.527 "aliases": [ 00:19:27.527 "d5d43ec9-1362-4f40-91f7-35c86e1a3ac0" 00:19:27.527 ], 00:19:27.527 "product_name": "Raid Volume", 00:19:27.527 "block_size": 512, 00:19:27.527 "num_blocks": 262144, 00:19:27.527 "uuid": "d5d43ec9-1362-4f40-91f7-35c86e1a3ac0", 00:19:27.527 "assigned_rate_limits": { 00:19:27.527 "rw_ios_per_sec": 0, 00:19:27.527 "rw_mbytes_per_sec": 0, 00:19:27.527 "r_mbytes_per_sec": 0, 00:19:27.527 "w_mbytes_per_sec": 0 00:19:27.527 }, 00:19:27.527 "claimed": false, 00:19:27.527 "zoned": false, 00:19:27.527 "supported_io_types": { 00:19:27.527 "read": true, 00:19:27.527 "write": true, 00:19:27.527 "unmap": true, 00:19:27.527 "flush": true, 00:19:27.527 "reset": true, 00:19:27.527 "nvme_admin": false, 00:19:27.527 "nvme_io": false, 00:19:27.527 "nvme_io_md": false, 00:19:27.527 "write_zeroes": true, 00:19:27.527 "zcopy": false, 00:19:27.527 "get_zone_info": false, 00:19:27.527 "zone_management": false, 00:19:27.527 "zone_append": false, 00:19:27.527 "compare": false, 00:19:27.527 
"compare_and_write": false, 00:19:27.527 "abort": false, 00:19:27.527 "seek_hole": false, 00:19:27.527 "seek_data": false, 00:19:27.527 "copy": false, 00:19:27.527 "nvme_iov_md": false 00:19:27.527 }, 00:19:27.527 "memory_domains": [ 00:19:27.527 { 00:19:27.527 "dma_device_id": "system", 00:19:27.527 "dma_device_type": 1 00:19:27.527 }, 00:19:27.527 { 00:19:27.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.527 "dma_device_type": 2 00:19:27.527 }, 00:19:27.527 { 00:19:27.527 "dma_device_id": "system", 00:19:27.527 "dma_device_type": 1 00:19:27.527 }, 00:19:27.527 { 00:19:27.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.527 "dma_device_type": 2 00:19:27.527 }, 00:19:27.527 { 00:19:27.527 "dma_device_id": "system", 00:19:27.527 "dma_device_type": 1 00:19:27.527 }, 00:19:27.527 { 00:19:27.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.527 "dma_device_type": 2 00:19:27.527 }, 00:19:27.527 { 00:19:27.527 "dma_device_id": "system", 00:19:27.527 "dma_device_type": 1 00:19:27.527 }, 00:19:27.527 { 00:19:27.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.527 "dma_device_type": 2 00:19:27.527 } 00:19:27.527 ], 00:19:27.527 "driver_specific": { 00:19:27.527 "raid": { 00:19:27.527 "uuid": "d5d43ec9-1362-4f40-91f7-35c86e1a3ac0", 00:19:27.527 "strip_size_kb": 64, 00:19:27.527 "state": "online", 00:19:27.527 "raid_level": "concat", 00:19:27.527 "superblock": false, 00:19:27.527 "num_base_bdevs": 4, 00:19:27.527 "num_base_bdevs_discovered": 4, 00:19:27.527 "num_base_bdevs_operational": 4, 00:19:27.527 "base_bdevs_list": [ 00:19:27.527 { 00:19:27.527 "name": "BaseBdev1", 00:19:27.527 "uuid": "c033485f-d4ea-43ae-b74a-ca5a0ec35f3e", 00:19:27.527 "is_configured": true, 00:19:27.527 "data_offset": 0, 00:19:27.527 "data_size": 65536 00:19:27.527 }, 00:19:27.527 { 00:19:27.527 "name": "BaseBdev2", 00:19:27.527 "uuid": "9b8d7d35-9527-439f-95f8-578ba7b730a3", 00:19:27.527 "is_configured": true, 00:19:27.527 "data_offset": 0, 00:19:27.527 "data_size": 65536 00:19:27.527 }, 00:19:27.527 { 00:19:27.527 "name": "BaseBdev3", 00:19:27.527 "uuid": "b31a4d58-5781-4b6e-ac52-7142b4067fa5", 00:19:27.527 "is_configured": true, 00:19:27.527 "data_offset": 0, 00:19:27.527 "data_size": 65536 00:19:27.527 }, 00:19:27.527 { 00:19:27.527 "name": "BaseBdev4", 00:19:27.527 "uuid": "128991d3-e98d-4bed-b6a7-231eac35bfba", 00:19:27.527 "is_configured": true, 00:19:27.527 "data_offset": 0, 00:19:27.527 "data_size": 65536 00:19:27.527 } 00:19:27.527 ] 00:19:27.527 } 00:19:27.527 } 00:19:27.527 }' 00:19:27.527 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:27.527 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:27.527 BaseBdev2 00:19:27.527 BaseBdev3 00:19:27.527 BaseBdev4' 00:19:27.527 13:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:27.527 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:27.527 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:27.833 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:27.833 "name": "BaseBdev1", 00:19:27.833 "aliases": [ 00:19:27.833 "c033485f-d4ea-43ae-b74a-ca5a0ec35f3e" 00:19:27.833 ], 00:19:27.833 
"product_name": "Malloc disk", 00:19:27.833 "block_size": 512, 00:19:27.833 "num_blocks": 65536, 00:19:27.833 "uuid": "c033485f-d4ea-43ae-b74a-ca5a0ec35f3e", 00:19:27.833 "assigned_rate_limits": { 00:19:27.833 "rw_ios_per_sec": 0, 00:19:27.833 "rw_mbytes_per_sec": 0, 00:19:27.833 "r_mbytes_per_sec": 0, 00:19:27.833 "w_mbytes_per_sec": 0 00:19:27.833 }, 00:19:27.833 "claimed": true, 00:19:27.833 "claim_type": "exclusive_write", 00:19:27.833 "zoned": false, 00:19:27.833 "supported_io_types": { 00:19:27.833 "read": true, 00:19:27.833 "write": true, 00:19:27.833 "unmap": true, 00:19:27.833 "flush": true, 00:19:27.833 "reset": true, 00:19:27.833 "nvme_admin": false, 00:19:27.833 "nvme_io": false, 00:19:27.833 "nvme_io_md": false, 00:19:27.833 "write_zeroes": true, 00:19:27.833 "zcopy": true, 00:19:27.833 "get_zone_info": false, 00:19:27.833 "zone_management": false, 00:19:27.833 "zone_append": false, 00:19:27.833 "compare": false, 00:19:27.833 "compare_and_write": false, 00:19:27.833 "abort": true, 00:19:27.833 "seek_hole": false, 00:19:27.833 "seek_data": false, 00:19:27.833 "copy": true, 00:19:27.833 "nvme_iov_md": false 00:19:27.833 }, 00:19:27.833 "memory_domains": [ 00:19:27.833 { 00:19:27.833 "dma_device_id": "system", 00:19:27.833 "dma_device_type": 1 00:19:27.833 }, 00:19:27.833 { 00:19:27.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.833 "dma_device_type": 2 00:19:27.833 } 00:19:27.833 ], 00:19:27.833 "driver_specific": {} 00:19:27.833 }' 00:19:27.833 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:27.833 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:27.833 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:27.833 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:27.833 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:28.106 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:28.106 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:28.106 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:28.106 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:28.106 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:28.106 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:28.106 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:28.106 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:28.106 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:28.106 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:28.423 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:28.423 "name": "BaseBdev2", 00:19:28.423 "aliases": [ 00:19:28.423 "9b8d7d35-9527-439f-95f8-578ba7b730a3" 00:19:28.423 ], 00:19:28.423 "product_name": "Malloc disk", 00:19:28.423 "block_size": 512, 00:19:28.423 "num_blocks": 65536, 00:19:28.423 "uuid": "9b8d7d35-9527-439f-95f8-578ba7b730a3", 00:19:28.423 
"assigned_rate_limits": { 00:19:28.423 "rw_ios_per_sec": 0, 00:19:28.423 "rw_mbytes_per_sec": 0, 00:19:28.423 "r_mbytes_per_sec": 0, 00:19:28.423 "w_mbytes_per_sec": 0 00:19:28.423 }, 00:19:28.423 "claimed": true, 00:19:28.423 "claim_type": "exclusive_write", 00:19:28.423 "zoned": false, 00:19:28.423 "supported_io_types": { 00:19:28.423 "read": true, 00:19:28.423 "write": true, 00:19:28.423 "unmap": true, 00:19:28.423 "flush": true, 00:19:28.423 "reset": true, 00:19:28.423 "nvme_admin": false, 00:19:28.423 "nvme_io": false, 00:19:28.423 "nvme_io_md": false, 00:19:28.423 "write_zeroes": true, 00:19:28.423 "zcopy": true, 00:19:28.423 "get_zone_info": false, 00:19:28.423 "zone_management": false, 00:19:28.423 "zone_append": false, 00:19:28.423 "compare": false, 00:19:28.423 "compare_and_write": false, 00:19:28.423 "abort": true, 00:19:28.423 "seek_hole": false, 00:19:28.423 "seek_data": false, 00:19:28.423 "copy": true, 00:19:28.423 "nvme_iov_md": false 00:19:28.423 }, 00:19:28.423 "memory_domains": [ 00:19:28.423 { 00:19:28.423 "dma_device_id": "system", 00:19:28.423 "dma_device_type": 1 00:19:28.423 }, 00:19:28.423 { 00:19:28.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.423 "dma_device_type": 2 00:19:28.423 } 00:19:28.423 ], 00:19:28.423 "driver_specific": {} 00:19:28.423 }' 00:19:28.423 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:28.423 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:28.423 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:28.423 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:28.423 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:28.423 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:28.423 13:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:28.728 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:28.728 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:28.728 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:28.728 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:28.728 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:28.728 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:28.728 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:28.728 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:28.987 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:28.987 "name": "BaseBdev3", 00:19:28.987 "aliases": [ 00:19:28.987 "b31a4d58-5781-4b6e-ac52-7142b4067fa5" 00:19:28.987 ], 00:19:28.987 "product_name": "Malloc disk", 00:19:28.987 "block_size": 512, 00:19:28.987 "num_blocks": 65536, 00:19:28.987 "uuid": "b31a4d58-5781-4b6e-ac52-7142b4067fa5", 00:19:28.987 "assigned_rate_limits": { 00:19:28.987 "rw_ios_per_sec": 0, 00:19:28.987 "rw_mbytes_per_sec": 0, 00:19:28.987 "r_mbytes_per_sec": 0, 00:19:28.987 "w_mbytes_per_sec": 0 00:19:28.987 
}, 00:19:28.987 "claimed": true, 00:19:28.987 "claim_type": "exclusive_write", 00:19:28.987 "zoned": false, 00:19:28.987 "supported_io_types": { 00:19:28.987 "read": true, 00:19:28.987 "write": true, 00:19:28.987 "unmap": true, 00:19:28.987 "flush": true, 00:19:28.987 "reset": true, 00:19:28.987 "nvme_admin": false, 00:19:28.987 "nvme_io": false, 00:19:28.987 "nvme_io_md": false, 00:19:28.987 "write_zeroes": true, 00:19:28.987 "zcopy": true, 00:19:28.987 "get_zone_info": false, 00:19:28.987 "zone_management": false, 00:19:28.987 "zone_append": false, 00:19:28.987 "compare": false, 00:19:28.987 "compare_and_write": false, 00:19:28.987 "abort": true, 00:19:28.987 "seek_hole": false, 00:19:28.987 "seek_data": false, 00:19:28.987 "copy": true, 00:19:28.987 "nvme_iov_md": false 00:19:28.987 }, 00:19:28.987 "memory_domains": [ 00:19:28.987 { 00:19:28.987 "dma_device_id": "system", 00:19:28.987 "dma_device_type": 1 00:19:28.987 }, 00:19:28.987 { 00:19:28.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.987 "dma_device_type": 2 00:19:28.987 } 00:19:28.987 ], 00:19:28.987 "driver_specific": {} 00:19:28.987 }' 00:19:28.987 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:28.987 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:28.987 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:28.987 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:28.987 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:28.987 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:28.987 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.246 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.246 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:29.246 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.246 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.246 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:29.246 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:29.246 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:29.246 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:29.505 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:29.505 "name": "BaseBdev4", 00:19:29.505 "aliases": [ 00:19:29.505 "128991d3-e98d-4bed-b6a7-231eac35bfba" 00:19:29.505 ], 00:19:29.505 "product_name": "Malloc disk", 00:19:29.505 "block_size": 512, 00:19:29.505 "num_blocks": 65536, 00:19:29.505 "uuid": "128991d3-e98d-4bed-b6a7-231eac35bfba", 00:19:29.505 "assigned_rate_limits": { 00:19:29.505 "rw_ios_per_sec": 0, 00:19:29.505 "rw_mbytes_per_sec": 0, 00:19:29.505 "r_mbytes_per_sec": 0, 00:19:29.505 "w_mbytes_per_sec": 0 00:19:29.505 }, 00:19:29.505 "claimed": true, 00:19:29.505 "claim_type": "exclusive_write", 00:19:29.505 "zoned": false, 00:19:29.505 "supported_io_types": { 00:19:29.505 "read": true, 
00:19:29.505 "write": true, 00:19:29.505 "unmap": true, 00:19:29.505 "flush": true, 00:19:29.505 "reset": true, 00:19:29.505 "nvme_admin": false, 00:19:29.505 "nvme_io": false, 00:19:29.505 "nvme_io_md": false, 00:19:29.505 "write_zeroes": true, 00:19:29.505 "zcopy": true, 00:19:29.505 "get_zone_info": false, 00:19:29.505 "zone_management": false, 00:19:29.505 "zone_append": false, 00:19:29.505 "compare": false, 00:19:29.505 "compare_and_write": false, 00:19:29.505 "abort": true, 00:19:29.505 "seek_hole": false, 00:19:29.505 "seek_data": false, 00:19:29.505 "copy": true, 00:19:29.505 "nvme_iov_md": false 00:19:29.505 }, 00:19:29.505 "memory_domains": [ 00:19:29.505 { 00:19:29.505 "dma_device_id": "system", 00:19:29.505 "dma_device_type": 1 00:19:29.505 }, 00:19:29.505 { 00:19:29.505 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.505 "dma_device_type": 2 00:19:29.505 } 00:19:29.505 ], 00:19:29.505 "driver_specific": {} 00:19:29.505 }' 00:19:29.505 13:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.505 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.505 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:29.505 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.764 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.764 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:29.764 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.764 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.764 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:29.764 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.764 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.764 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:29.764 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:30.023 [2024-07-12 13:45:18.546202] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:30.023 [2024-07-12 13:45:18.546234] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:30.023 [2024-07-12 13:45:18.546281] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:30.023 13:45:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:30.023 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:30.024 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.024 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:30.283 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:30.283 "name": "Existed_Raid", 00:19:30.283 "uuid": "d5d43ec9-1362-4f40-91f7-35c86e1a3ac0", 00:19:30.283 "strip_size_kb": 64, 00:19:30.283 "state": "offline", 00:19:30.283 "raid_level": "concat", 00:19:30.283 "superblock": false, 00:19:30.283 "num_base_bdevs": 4, 00:19:30.283 "num_base_bdevs_discovered": 3, 00:19:30.283 "num_base_bdevs_operational": 3, 00:19:30.283 "base_bdevs_list": [ 00:19:30.283 { 00:19:30.283 "name": null, 00:19:30.283 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:30.283 "is_configured": false, 00:19:30.283 "data_offset": 0, 00:19:30.283 "data_size": 65536 00:19:30.283 }, 00:19:30.283 { 00:19:30.283 "name": "BaseBdev2", 00:19:30.283 "uuid": "9b8d7d35-9527-439f-95f8-578ba7b730a3", 00:19:30.283 "is_configured": true, 00:19:30.283 "data_offset": 0, 00:19:30.283 "data_size": 65536 00:19:30.283 }, 00:19:30.283 { 00:19:30.283 "name": "BaseBdev3", 00:19:30.283 "uuid": "b31a4d58-5781-4b6e-ac52-7142b4067fa5", 00:19:30.283 "is_configured": true, 00:19:30.283 "data_offset": 0, 00:19:30.283 "data_size": 65536 00:19:30.283 }, 00:19:30.283 { 00:19:30.283 "name": "BaseBdev4", 00:19:30.283 "uuid": "128991d3-e98d-4bed-b6a7-231eac35bfba", 00:19:30.283 "is_configured": true, 00:19:30.283 "data_offset": 0, 00:19:30.283 "data_size": 65536 00:19:30.283 } 00:19:30.283 ] 00:19:30.283 }' 00:19:30.283 13:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:30.283 13:45:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:30.850 13:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:30.850 13:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:30.850 13:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.850 13:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:31.109 13:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:31.109 13:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:19:31.109 13:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:31.368 [2024-07-12 13:45:19.903713] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:31.368 13:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:31.368 13:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:31.368 13:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.368 13:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:31.627 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:31.627 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:31.627 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:31.886 [2024-07-12 13:45:20.423677] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:31.886 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:31.886 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:31.886 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.886 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:32.144 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:32.144 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:32.144 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:32.401 [2024-07-12 13:45:20.925449] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:32.401 [2024-07-12 13:45:20.925493] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19ebc40 name Existed_Raid, state offline 00:19:32.401 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:32.401 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:32.401 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.401 13:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:32.658 13:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:32.658 13:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:32.658 13:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:32.658 13:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i 
= 1 )) 00:19:32.658 13:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:32.658 13:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:32.916 BaseBdev2 00:19:32.916 13:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:32.916 13:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:32.916 13:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:32.916 13:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:32.916 13:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:32.916 13:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:32.916 13:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:33.176 13:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:33.436 [ 00:19:33.436 { 00:19:33.436 "name": "BaseBdev2", 00:19:33.436 "aliases": [ 00:19:33.436 "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb" 00:19:33.436 ], 00:19:33.436 "product_name": "Malloc disk", 00:19:33.436 "block_size": 512, 00:19:33.436 "num_blocks": 65536, 00:19:33.436 "uuid": "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb", 00:19:33.436 "assigned_rate_limits": { 00:19:33.436 "rw_ios_per_sec": 0, 00:19:33.436 "rw_mbytes_per_sec": 0, 00:19:33.436 "r_mbytes_per_sec": 0, 00:19:33.436 "w_mbytes_per_sec": 0 00:19:33.436 }, 00:19:33.436 "claimed": false, 00:19:33.436 "zoned": false, 00:19:33.436 "supported_io_types": { 00:19:33.436 "read": true, 00:19:33.436 "write": true, 00:19:33.436 "unmap": true, 00:19:33.436 "flush": true, 00:19:33.436 "reset": true, 00:19:33.436 "nvme_admin": false, 00:19:33.436 "nvme_io": false, 00:19:33.436 "nvme_io_md": false, 00:19:33.436 "write_zeroes": true, 00:19:33.436 "zcopy": true, 00:19:33.436 "get_zone_info": false, 00:19:33.436 "zone_management": false, 00:19:33.436 "zone_append": false, 00:19:33.436 "compare": false, 00:19:33.436 "compare_and_write": false, 00:19:33.436 "abort": true, 00:19:33.436 "seek_hole": false, 00:19:33.436 "seek_data": false, 00:19:33.436 "copy": true, 00:19:33.436 "nvme_iov_md": false 00:19:33.436 }, 00:19:33.436 "memory_domains": [ 00:19:33.436 { 00:19:33.436 "dma_device_id": "system", 00:19:33.436 "dma_device_type": 1 00:19:33.436 }, 00:19:33.436 { 00:19:33.436 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:33.436 "dma_device_type": 2 00:19:33.436 } 00:19:33.436 ], 00:19:33.436 "driver_specific": {} 00:19:33.436 } 00:19:33.436 ] 00:19:33.436 13:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:33.436 13:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:33.436 13:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:33.436 13:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:33.694 BaseBdev3 00:19:33.694 13:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:33.695 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:33.695 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:33.695 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:33.695 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:33.695 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:33.695 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:33.952 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:34.210 [ 00:19:34.210 { 00:19:34.210 "name": "BaseBdev3", 00:19:34.210 "aliases": [ 00:19:34.210 "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586" 00:19:34.210 ], 00:19:34.210 "product_name": "Malloc disk", 00:19:34.210 "block_size": 512, 00:19:34.210 "num_blocks": 65536, 00:19:34.210 "uuid": "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586", 00:19:34.210 "assigned_rate_limits": { 00:19:34.210 "rw_ios_per_sec": 0, 00:19:34.210 "rw_mbytes_per_sec": 0, 00:19:34.210 "r_mbytes_per_sec": 0, 00:19:34.210 "w_mbytes_per_sec": 0 00:19:34.210 }, 00:19:34.210 "claimed": false, 00:19:34.210 "zoned": false, 00:19:34.210 "supported_io_types": { 00:19:34.210 "read": true, 00:19:34.210 "write": true, 00:19:34.210 "unmap": true, 00:19:34.210 "flush": true, 00:19:34.210 "reset": true, 00:19:34.210 "nvme_admin": false, 00:19:34.210 "nvme_io": false, 00:19:34.210 "nvme_io_md": false, 00:19:34.210 "write_zeroes": true, 00:19:34.210 "zcopy": true, 00:19:34.210 "get_zone_info": false, 00:19:34.210 "zone_management": false, 00:19:34.210 "zone_append": false, 00:19:34.210 "compare": false, 00:19:34.210 "compare_and_write": false, 00:19:34.210 "abort": true, 00:19:34.210 "seek_hole": false, 00:19:34.210 "seek_data": false, 00:19:34.210 "copy": true, 00:19:34.210 "nvme_iov_md": false 00:19:34.210 }, 00:19:34.210 "memory_domains": [ 00:19:34.210 { 00:19:34.210 "dma_device_id": "system", 00:19:34.210 "dma_device_type": 1 00:19:34.210 }, 00:19:34.210 { 00:19:34.210 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.210 "dma_device_type": 2 00:19:34.210 } 00:19:34.210 ], 00:19:34.210 "driver_specific": {} 00:19:34.210 } 00:19:34.210 ] 00:19:34.210 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:34.210 13:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:34.210 13:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:34.210 13:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:34.469 BaseBdev4 00:19:34.469 13:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:34.469 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # 
local bdev_name=BaseBdev4 00:19:34.469 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:34.469 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:34.469 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:34.469 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:34.469 13:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:34.727 13:45:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:34.984 [ 00:19:34.984 { 00:19:34.984 "name": "BaseBdev4", 00:19:34.984 "aliases": [ 00:19:34.984 "8c39e635-89f7-4ace-8f00-272c1720c643" 00:19:34.984 ], 00:19:34.984 "product_name": "Malloc disk", 00:19:34.984 "block_size": 512, 00:19:34.984 "num_blocks": 65536, 00:19:34.984 "uuid": "8c39e635-89f7-4ace-8f00-272c1720c643", 00:19:34.984 "assigned_rate_limits": { 00:19:34.984 "rw_ios_per_sec": 0, 00:19:34.984 "rw_mbytes_per_sec": 0, 00:19:34.984 "r_mbytes_per_sec": 0, 00:19:34.984 "w_mbytes_per_sec": 0 00:19:34.985 }, 00:19:34.985 "claimed": false, 00:19:34.985 "zoned": false, 00:19:34.985 "supported_io_types": { 00:19:34.985 "read": true, 00:19:34.985 "write": true, 00:19:34.985 "unmap": true, 00:19:34.985 "flush": true, 00:19:34.985 "reset": true, 00:19:34.985 "nvme_admin": false, 00:19:34.985 "nvme_io": false, 00:19:34.985 "nvme_io_md": false, 00:19:34.985 "write_zeroes": true, 00:19:34.985 "zcopy": true, 00:19:34.985 "get_zone_info": false, 00:19:34.985 "zone_management": false, 00:19:34.985 "zone_append": false, 00:19:34.985 "compare": false, 00:19:34.985 "compare_and_write": false, 00:19:34.985 "abort": true, 00:19:34.985 "seek_hole": false, 00:19:34.985 "seek_data": false, 00:19:34.985 "copy": true, 00:19:34.985 "nvme_iov_md": false 00:19:34.985 }, 00:19:34.985 "memory_domains": [ 00:19:34.985 { 00:19:34.985 "dma_device_id": "system", 00:19:34.985 "dma_device_type": 1 00:19:34.985 }, 00:19:34.985 { 00:19:34.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.985 "dma_device_type": 2 00:19:34.985 } 00:19:34.985 ], 00:19:34.985 "driver_specific": {} 00:19:34.985 } 00:19:34.985 ] 00:19:34.985 13:45:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:34.985 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:34.985 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:34.985 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:35.552 [2024-07-12 13:45:23.866596] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:35.552 [2024-07-12 13:45:23.866642] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:35.552 [2024-07-12 13:45:23.866663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:35.552 [2024-07-12 13:45:23.868045] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:35.552 [2024-07-12 13:45:23.868087] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.552 13:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:35.811 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.811 "name": "Existed_Raid", 00:19:35.811 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.811 "strip_size_kb": 64, 00:19:35.811 "state": "configuring", 00:19:35.811 "raid_level": "concat", 00:19:35.811 "superblock": false, 00:19:35.811 "num_base_bdevs": 4, 00:19:35.811 "num_base_bdevs_discovered": 3, 00:19:35.811 "num_base_bdevs_operational": 4, 00:19:35.811 "base_bdevs_list": [ 00:19:35.811 { 00:19:35.811 "name": "BaseBdev1", 00:19:35.811 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.811 "is_configured": false, 00:19:35.811 "data_offset": 0, 00:19:35.811 "data_size": 0 00:19:35.811 }, 00:19:35.811 { 00:19:35.811 "name": "BaseBdev2", 00:19:35.811 "uuid": "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb", 00:19:35.811 "is_configured": true, 00:19:35.811 "data_offset": 0, 00:19:35.811 "data_size": 65536 00:19:35.811 }, 00:19:35.811 { 00:19:35.811 "name": "BaseBdev3", 00:19:35.811 "uuid": "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586", 00:19:35.811 "is_configured": true, 00:19:35.811 "data_offset": 0, 00:19:35.811 "data_size": 65536 00:19:35.811 }, 00:19:35.811 { 00:19:35.811 "name": "BaseBdev4", 00:19:35.811 "uuid": "8c39e635-89f7-4ace-8f00-272c1720c643", 00:19:35.811 "is_configured": true, 00:19:35.811 "data_offset": 0, 00:19:35.811 "data_size": 65536 00:19:35.811 } 00:19:35.811 ] 00:19:35.811 }' 00:19:35.811 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.811 13:45:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:36.378 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 
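The step logged above (bdev_raid.sh@308) hot-removes BaseBdev2 from the concat array, and the records that follow verify the raid falls back to the "configuring" state with that slot reported as unconfigured. As a minimal standalone sketch of the same rpc.py/jq pattern the test exercises here — assuming only that an SPDK target is already serving /var/tmp/spdk-raid.sock and that a concat raid named Existed_Raid exists (neither is set up by this snippet):

    #!/usr/bin/env bash
    # Sketch only: mirrors the remove/verify sequence from bdev_raid.sh above.
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

    # Hot-remove the second base bdev from the concat raid.
    rpc bdev_raid_remove_base_bdev BaseBdev2

    # Dump the raid bdev and check that the removed slot is no longer configured
    # and that the array state dropped back to "configuring".
    info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
    state=$(echo "$info" | jq -r '.state')
    slot2=$(echo "$info" | jq -r '.base_bdevs_list[1].is_configured')
    [[ $state == configuring && $slot2 == false ]] && echo "remove verified"

The same query shape (bdev_raid_get_bdevs all piped through jq with a select on the raid name) is what verify_raid_bdev_state uses throughout the rest of this test to check state, raid_level, strip size, and the base_bdevs_list entries.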
00:19:36.637 [2024-07-12 13:45:24.965596] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.637 13:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:36.637 13:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:36.637 "name": "Existed_Raid", 00:19:36.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:36.637 "strip_size_kb": 64, 00:19:36.637 "state": "configuring", 00:19:36.637 "raid_level": "concat", 00:19:36.637 "superblock": false, 00:19:36.637 "num_base_bdevs": 4, 00:19:36.637 "num_base_bdevs_discovered": 2, 00:19:36.637 "num_base_bdevs_operational": 4, 00:19:36.637 "base_bdevs_list": [ 00:19:36.637 { 00:19:36.637 "name": "BaseBdev1", 00:19:36.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:36.637 "is_configured": false, 00:19:36.637 "data_offset": 0, 00:19:36.637 "data_size": 0 00:19:36.637 }, 00:19:36.637 { 00:19:36.637 "name": null, 00:19:36.637 "uuid": "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb", 00:19:36.637 "is_configured": false, 00:19:36.637 "data_offset": 0, 00:19:36.637 "data_size": 65536 00:19:36.637 }, 00:19:36.637 { 00:19:36.637 "name": "BaseBdev3", 00:19:36.637 "uuid": "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586", 00:19:36.637 "is_configured": true, 00:19:36.637 "data_offset": 0, 00:19:36.637 "data_size": 65536 00:19:36.637 }, 00:19:36.637 { 00:19:36.637 "name": "BaseBdev4", 00:19:36.637 "uuid": "8c39e635-89f7-4ace-8f00-272c1720c643", 00:19:36.637 "is_configured": true, 00:19:36.637 "data_offset": 0, 00:19:36.637 "data_size": 65536 00:19:36.637 } 00:19:36.637 ] 00:19:36.637 }' 00:19:36.637 13:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:36.637 13:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:37.205 13:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.205 13:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:19:37.463 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:37.463 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:37.722 [2024-07-12 13:45:26.249592] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:37.722 BaseBdev1 00:19:37.722 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:37.722 13:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:37.722 13:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:37.722 13:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:37.722 13:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:37.722 13:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:37.722 13:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:37.981 13:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:38.240 [ 00:19:38.240 { 00:19:38.240 "name": "BaseBdev1", 00:19:38.240 "aliases": [ 00:19:38.240 "26fc70f0-03a4-45f9-aaa8-565ad0f95e74" 00:19:38.240 ], 00:19:38.240 "product_name": "Malloc disk", 00:19:38.240 "block_size": 512, 00:19:38.240 "num_blocks": 65536, 00:19:38.240 "uuid": "26fc70f0-03a4-45f9-aaa8-565ad0f95e74", 00:19:38.240 "assigned_rate_limits": { 00:19:38.240 "rw_ios_per_sec": 0, 00:19:38.240 "rw_mbytes_per_sec": 0, 00:19:38.240 "r_mbytes_per_sec": 0, 00:19:38.240 "w_mbytes_per_sec": 0 00:19:38.240 }, 00:19:38.240 "claimed": true, 00:19:38.240 "claim_type": "exclusive_write", 00:19:38.240 "zoned": false, 00:19:38.240 "supported_io_types": { 00:19:38.240 "read": true, 00:19:38.240 "write": true, 00:19:38.240 "unmap": true, 00:19:38.240 "flush": true, 00:19:38.240 "reset": true, 00:19:38.240 "nvme_admin": false, 00:19:38.240 "nvme_io": false, 00:19:38.240 "nvme_io_md": false, 00:19:38.240 "write_zeroes": true, 00:19:38.240 "zcopy": true, 00:19:38.240 "get_zone_info": false, 00:19:38.240 "zone_management": false, 00:19:38.240 "zone_append": false, 00:19:38.240 "compare": false, 00:19:38.240 "compare_and_write": false, 00:19:38.240 "abort": true, 00:19:38.240 "seek_hole": false, 00:19:38.240 "seek_data": false, 00:19:38.240 "copy": true, 00:19:38.240 "nvme_iov_md": false 00:19:38.240 }, 00:19:38.240 "memory_domains": [ 00:19:38.240 { 00:19:38.240 "dma_device_id": "system", 00:19:38.240 "dma_device_type": 1 00:19:38.240 }, 00:19:38.240 { 00:19:38.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.240 "dma_device_type": 2 00:19:38.240 } 00:19:38.240 ], 00:19:38.240 "driver_specific": {} 00:19:38.240 } 00:19:38.240 ] 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:38.240 13:45:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.240 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:38.500 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:38.500 "name": "Existed_Raid", 00:19:38.500 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:38.500 "strip_size_kb": 64, 00:19:38.500 "state": "configuring", 00:19:38.500 "raid_level": "concat", 00:19:38.500 "superblock": false, 00:19:38.500 "num_base_bdevs": 4, 00:19:38.500 "num_base_bdevs_discovered": 3, 00:19:38.500 "num_base_bdevs_operational": 4, 00:19:38.500 "base_bdevs_list": [ 00:19:38.500 { 00:19:38.500 "name": "BaseBdev1", 00:19:38.500 "uuid": "26fc70f0-03a4-45f9-aaa8-565ad0f95e74", 00:19:38.500 "is_configured": true, 00:19:38.500 "data_offset": 0, 00:19:38.500 "data_size": 65536 00:19:38.500 }, 00:19:38.500 { 00:19:38.500 "name": null, 00:19:38.500 "uuid": "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb", 00:19:38.500 "is_configured": false, 00:19:38.500 "data_offset": 0, 00:19:38.500 "data_size": 65536 00:19:38.500 }, 00:19:38.500 { 00:19:38.500 "name": "BaseBdev3", 00:19:38.500 "uuid": "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586", 00:19:38.500 "is_configured": true, 00:19:38.500 "data_offset": 0, 00:19:38.500 "data_size": 65536 00:19:38.500 }, 00:19:38.500 { 00:19:38.500 "name": "BaseBdev4", 00:19:38.500 "uuid": "8c39e635-89f7-4ace-8f00-272c1720c643", 00:19:38.500 "is_configured": true, 00:19:38.500 "data_offset": 0, 00:19:38.500 "data_size": 65536 00:19:38.500 } 00:19:38.500 ] 00:19:38.500 }' 00:19:38.500 13:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:38.500 13:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:39.067 13:45:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.067 13:45:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:39.326 13:45:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:39.326 13:45:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:39.585 [2024-07-12 13:45:28.022313] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.585 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:39.844 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.844 "name": "Existed_Raid", 00:19:39.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:39.844 "strip_size_kb": 64, 00:19:39.844 "state": "configuring", 00:19:39.844 "raid_level": "concat", 00:19:39.844 "superblock": false, 00:19:39.844 "num_base_bdevs": 4, 00:19:39.844 "num_base_bdevs_discovered": 2, 00:19:39.844 "num_base_bdevs_operational": 4, 00:19:39.844 "base_bdevs_list": [ 00:19:39.844 { 00:19:39.844 "name": "BaseBdev1", 00:19:39.844 "uuid": "26fc70f0-03a4-45f9-aaa8-565ad0f95e74", 00:19:39.844 "is_configured": true, 00:19:39.844 "data_offset": 0, 00:19:39.844 "data_size": 65536 00:19:39.844 }, 00:19:39.844 { 00:19:39.844 "name": null, 00:19:39.844 "uuid": "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb", 00:19:39.844 "is_configured": false, 00:19:39.844 "data_offset": 0, 00:19:39.844 "data_size": 65536 00:19:39.844 }, 00:19:39.844 { 00:19:39.844 "name": null, 00:19:39.844 "uuid": "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586", 00:19:39.844 "is_configured": false, 00:19:39.844 "data_offset": 0, 00:19:39.844 "data_size": 65536 00:19:39.844 }, 00:19:39.844 { 00:19:39.844 "name": "BaseBdev4", 00:19:39.844 "uuid": "8c39e635-89f7-4ace-8f00-272c1720c643", 00:19:39.844 "is_configured": true, 00:19:39.844 "data_offset": 0, 00:19:39.844 "data_size": 65536 00:19:39.844 } 00:19:39.844 ] 00:19:39.844 }' 00:19:39.844 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.844 13:45:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:40.412 13:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.412 13:45:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:40.671 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:40.671 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:41.236 [2024-07-12 13:45:29.634600] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.236 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:41.495 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:41.495 "name": "Existed_Raid", 00:19:41.495 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:41.495 "strip_size_kb": 64, 00:19:41.495 "state": "configuring", 00:19:41.495 "raid_level": "concat", 00:19:41.495 "superblock": false, 00:19:41.495 "num_base_bdevs": 4, 00:19:41.495 "num_base_bdevs_discovered": 3, 00:19:41.495 "num_base_bdevs_operational": 4, 00:19:41.495 "base_bdevs_list": [ 00:19:41.495 { 00:19:41.495 "name": "BaseBdev1", 00:19:41.495 "uuid": "26fc70f0-03a4-45f9-aaa8-565ad0f95e74", 00:19:41.495 "is_configured": true, 00:19:41.495 "data_offset": 0, 00:19:41.495 "data_size": 65536 00:19:41.495 }, 00:19:41.495 { 00:19:41.495 "name": null, 00:19:41.495 "uuid": "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb", 00:19:41.495 "is_configured": false, 00:19:41.495 "data_offset": 0, 00:19:41.495 "data_size": 65536 00:19:41.495 }, 00:19:41.495 { 00:19:41.495 "name": "BaseBdev3", 00:19:41.495 "uuid": "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586", 00:19:41.495 "is_configured": true, 00:19:41.495 "data_offset": 0, 00:19:41.495 "data_size": 65536 00:19:41.495 }, 00:19:41.495 { 00:19:41.495 "name": "BaseBdev4", 00:19:41.495 "uuid": "8c39e635-89f7-4ace-8f00-272c1720c643", 00:19:41.495 "is_configured": true, 00:19:41.495 "data_offset": 0, 00:19:41.495 "data_size": 65536 00:19:41.495 } 00:19:41.495 ] 00:19:41.495 }' 00:19:41.495 13:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:19:41.495 13:45:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.062 13:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.062 13:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:42.321 13:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:42.321 13:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:42.579 [2024-07-12 13:45:30.982194] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.579 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:42.838 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:42.838 "name": "Existed_Raid", 00:19:42.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:42.838 "strip_size_kb": 64, 00:19:42.838 "state": "configuring", 00:19:42.838 "raid_level": "concat", 00:19:42.838 "superblock": false, 00:19:42.838 "num_base_bdevs": 4, 00:19:42.838 "num_base_bdevs_discovered": 2, 00:19:42.838 "num_base_bdevs_operational": 4, 00:19:42.838 "base_bdevs_list": [ 00:19:42.838 { 00:19:42.838 "name": null, 00:19:42.838 "uuid": "26fc70f0-03a4-45f9-aaa8-565ad0f95e74", 00:19:42.838 "is_configured": false, 00:19:42.838 "data_offset": 0, 00:19:42.838 "data_size": 65536 00:19:42.838 }, 00:19:42.838 { 00:19:42.838 "name": null, 00:19:42.838 "uuid": "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb", 00:19:42.838 "is_configured": false, 00:19:42.838 "data_offset": 0, 00:19:42.838 "data_size": 65536 00:19:42.838 }, 00:19:42.838 { 00:19:42.839 "name": "BaseBdev3", 00:19:42.839 "uuid": "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586", 00:19:42.839 "is_configured": true, 00:19:42.839 "data_offset": 0, 00:19:42.839 "data_size": 65536 00:19:42.839 }, 00:19:42.839 { 
00:19:42.839 "name": "BaseBdev4", 00:19:42.839 "uuid": "8c39e635-89f7-4ace-8f00-272c1720c643", 00:19:42.839 "is_configured": true, 00:19:42.839 "data_offset": 0, 00:19:42.839 "data_size": 65536 00:19:42.839 } 00:19:42.839 ] 00:19:42.839 }' 00:19:42.839 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:42.839 13:45:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:43.405 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.405 13:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:43.664 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:43.664 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:43.923 [2024-07-12 13:45:32.333574] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.923 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:44.182 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:44.182 "name": "Existed_Raid", 00:19:44.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.182 "strip_size_kb": 64, 00:19:44.182 "state": "configuring", 00:19:44.182 "raid_level": "concat", 00:19:44.182 "superblock": false, 00:19:44.182 "num_base_bdevs": 4, 00:19:44.182 "num_base_bdevs_discovered": 3, 00:19:44.182 "num_base_bdevs_operational": 4, 00:19:44.182 "base_bdevs_list": [ 00:19:44.182 { 00:19:44.182 "name": null, 00:19:44.182 "uuid": "26fc70f0-03a4-45f9-aaa8-565ad0f95e74", 00:19:44.182 "is_configured": false, 00:19:44.182 "data_offset": 0, 00:19:44.182 "data_size": 65536 00:19:44.182 }, 00:19:44.182 { 00:19:44.182 "name": "BaseBdev2", 00:19:44.182 "uuid": 
"e75cc9a8-7b55-48d5-8a08-94e07a66ebdb", 00:19:44.182 "is_configured": true, 00:19:44.182 "data_offset": 0, 00:19:44.182 "data_size": 65536 00:19:44.182 }, 00:19:44.182 { 00:19:44.182 "name": "BaseBdev3", 00:19:44.182 "uuid": "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586", 00:19:44.182 "is_configured": true, 00:19:44.182 "data_offset": 0, 00:19:44.182 "data_size": 65536 00:19:44.182 }, 00:19:44.182 { 00:19:44.182 "name": "BaseBdev4", 00:19:44.182 "uuid": "8c39e635-89f7-4ace-8f00-272c1720c643", 00:19:44.182 "is_configured": true, 00:19:44.182 "data_offset": 0, 00:19:44.182 "data_size": 65536 00:19:44.182 } 00:19:44.182 ] 00:19:44.182 }' 00:19:44.182 13:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:44.182 13:45:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:44.749 13:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.749 13:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:45.008 13:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:45.008 13:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.008 13:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:45.267 13:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 26fc70f0-03a4-45f9-aaa8-565ad0f95e74 00:19:45.526 [2024-07-12 13:45:33.962495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:45.526 [2024-07-12 13:45:33.962535] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19efa20 00:19:45.526 [2024-07-12 13:45:33.962544] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:45.526 [2024-07-12 13:45:33.962745] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19eb320 00:19:45.526 [2024-07-12 13:45:33.962862] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19efa20 00:19:45.526 [2024-07-12 13:45:33.962871] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19efa20 00:19:45.526 [2024-07-12 13:45:33.963046] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:45.526 NewBaseBdev 00:19:45.526 13:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:45.526 13:45:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:45.526 13:45:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:45.526 13:45:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:45.526 13:45:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:45.526 13:45:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:45.526 13:45:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:45.785 13:45:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:46.044 [ 00:19:46.044 { 00:19:46.044 "name": "NewBaseBdev", 00:19:46.044 "aliases": [ 00:19:46.044 "26fc70f0-03a4-45f9-aaa8-565ad0f95e74" 00:19:46.044 ], 00:19:46.044 "product_name": "Malloc disk", 00:19:46.044 "block_size": 512, 00:19:46.044 "num_blocks": 65536, 00:19:46.044 "uuid": "26fc70f0-03a4-45f9-aaa8-565ad0f95e74", 00:19:46.044 "assigned_rate_limits": { 00:19:46.044 "rw_ios_per_sec": 0, 00:19:46.044 "rw_mbytes_per_sec": 0, 00:19:46.044 "r_mbytes_per_sec": 0, 00:19:46.044 "w_mbytes_per_sec": 0 00:19:46.044 }, 00:19:46.044 "claimed": true, 00:19:46.044 "claim_type": "exclusive_write", 00:19:46.044 "zoned": false, 00:19:46.044 "supported_io_types": { 00:19:46.044 "read": true, 00:19:46.044 "write": true, 00:19:46.044 "unmap": true, 00:19:46.044 "flush": true, 00:19:46.044 "reset": true, 00:19:46.044 "nvme_admin": false, 00:19:46.044 "nvme_io": false, 00:19:46.044 "nvme_io_md": false, 00:19:46.044 "write_zeroes": true, 00:19:46.044 "zcopy": true, 00:19:46.044 "get_zone_info": false, 00:19:46.044 "zone_management": false, 00:19:46.044 "zone_append": false, 00:19:46.044 "compare": false, 00:19:46.044 "compare_and_write": false, 00:19:46.044 "abort": true, 00:19:46.044 "seek_hole": false, 00:19:46.044 "seek_data": false, 00:19:46.044 "copy": true, 00:19:46.044 "nvme_iov_md": false 00:19:46.044 }, 00:19:46.044 "memory_domains": [ 00:19:46.044 { 00:19:46.044 "dma_device_id": "system", 00:19:46.044 "dma_device_type": 1 00:19:46.044 }, 00:19:46.044 { 00:19:46.044 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.044 "dma_device_type": 2 00:19:46.044 } 00:19:46.044 ], 00:19:46.044 "driver_specific": {} 00:19:46.044 } 00:19:46.044 ] 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:46.044 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.303 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.303 "name": "Existed_Raid", 00:19:46.303 "uuid": "398f863d-c393-4684-b127-450581e58dae", 00:19:46.303 "strip_size_kb": 64, 00:19:46.303 "state": "online", 00:19:46.303 "raid_level": "concat", 00:19:46.303 "superblock": false, 00:19:46.303 "num_base_bdevs": 4, 00:19:46.303 "num_base_bdevs_discovered": 4, 00:19:46.303 "num_base_bdevs_operational": 4, 00:19:46.303 "base_bdevs_list": [ 00:19:46.303 { 00:19:46.303 "name": "NewBaseBdev", 00:19:46.303 "uuid": "26fc70f0-03a4-45f9-aaa8-565ad0f95e74", 00:19:46.303 "is_configured": true, 00:19:46.303 "data_offset": 0, 00:19:46.303 "data_size": 65536 00:19:46.303 }, 00:19:46.303 { 00:19:46.303 "name": "BaseBdev2", 00:19:46.303 "uuid": "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb", 00:19:46.303 "is_configured": true, 00:19:46.303 "data_offset": 0, 00:19:46.303 "data_size": 65536 00:19:46.303 }, 00:19:46.303 { 00:19:46.303 "name": "BaseBdev3", 00:19:46.303 "uuid": "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586", 00:19:46.303 "is_configured": true, 00:19:46.303 "data_offset": 0, 00:19:46.303 "data_size": 65536 00:19:46.303 }, 00:19:46.303 { 00:19:46.303 "name": "BaseBdev4", 00:19:46.303 "uuid": "8c39e635-89f7-4ace-8f00-272c1720c643", 00:19:46.303 "is_configured": true, 00:19:46.303 "data_offset": 0, 00:19:46.303 "data_size": 65536 00:19:46.303 } 00:19:46.303 ] 00:19:46.303 }' 00:19:46.303 13:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.303 13:45:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:46.868 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:46.868 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:46.868 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:46.868 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:46.868 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:46.868 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:46.868 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:46.868 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:47.126 [2024-07-12 13:45:35.531004] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:47.126 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:47.126 "name": "Existed_Raid", 00:19:47.126 "aliases": [ 00:19:47.126 "398f863d-c393-4684-b127-450581e58dae" 00:19:47.126 ], 00:19:47.126 "product_name": "Raid Volume", 00:19:47.126 "block_size": 512, 00:19:47.126 "num_blocks": 262144, 00:19:47.126 "uuid": "398f863d-c393-4684-b127-450581e58dae", 00:19:47.126 "assigned_rate_limits": { 00:19:47.126 "rw_ios_per_sec": 0, 00:19:47.126 "rw_mbytes_per_sec": 0, 00:19:47.126 "r_mbytes_per_sec": 0, 00:19:47.126 "w_mbytes_per_sec": 0 00:19:47.126 }, 00:19:47.126 "claimed": false, 00:19:47.126 "zoned": false, 00:19:47.126 "supported_io_types": { 00:19:47.126 "read": true, 00:19:47.126 "write": true, 00:19:47.126 
"unmap": true, 00:19:47.126 "flush": true, 00:19:47.126 "reset": true, 00:19:47.126 "nvme_admin": false, 00:19:47.126 "nvme_io": false, 00:19:47.126 "nvme_io_md": false, 00:19:47.126 "write_zeroes": true, 00:19:47.126 "zcopy": false, 00:19:47.126 "get_zone_info": false, 00:19:47.126 "zone_management": false, 00:19:47.126 "zone_append": false, 00:19:47.126 "compare": false, 00:19:47.126 "compare_and_write": false, 00:19:47.126 "abort": false, 00:19:47.126 "seek_hole": false, 00:19:47.126 "seek_data": false, 00:19:47.126 "copy": false, 00:19:47.126 "nvme_iov_md": false 00:19:47.126 }, 00:19:47.126 "memory_domains": [ 00:19:47.126 { 00:19:47.126 "dma_device_id": "system", 00:19:47.126 "dma_device_type": 1 00:19:47.126 }, 00:19:47.126 { 00:19:47.126 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.126 "dma_device_type": 2 00:19:47.126 }, 00:19:47.126 { 00:19:47.126 "dma_device_id": "system", 00:19:47.126 "dma_device_type": 1 00:19:47.126 }, 00:19:47.126 { 00:19:47.126 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.126 "dma_device_type": 2 00:19:47.126 }, 00:19:47.126 { 00:19:47.126 "dma_device_id": "system", 00:19:47.126 "dma_device_type": 1 00:19:47.126 }, 00:19:47.126 { 00:19:47.126 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.126 "dma_device_type": 2 00:19:47.126 }, 00:19:47.126 { 00:19:47.126 "dma_device_id": "system", 00:19:47.126 "dma_device_type": 1 00:19:47.126 }, 00:19:47.126 { 00:19:47.126 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.126 "dma_device_type": 2 00:19:47.126 } 00:19:47.126 ], 00:19:47.126 "driver_specific": { 00:19:47.126 "raid": { 00:19:47.126 "uuid": "398f863d-c393-4684-b127-450581e58dae", 00:19:47.126 "strip_size_kb": 64, 00:19:47.126 "state": "online", 00:19:47.126 "raid_level": "concat", 00:19:47.126 "superblock": false, 00:19:47.126 "num_base_bdevs": 4, 00:19:47.126 "num_base_bdevs_discovered": 4, 00:19:47.126 "num_base_bdevs_operational": 4, 00:19:47.126 "base_bdevs_list": [ 00:19:47.126 { 00:19:47.126 "name": "NewBaseBdev", 00:19:47.126 "uuid": "26fc70f0-03a4-45f9-aaa8-565ad0f95e74", 00:19:47.126 "is_configured": true, 00:19:47.126 "data_offset": 0, 00:19:47.126 "data_size": 65536 00:19:47.126 }, 00:19:47.126 { 00:19:47.126 "name": "BaseBdev2", 00:19:47.126 "uuid": "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb", 00:19:47.126 "is_configured": true, 00:19:47.126 "data_offset": 0, 00:19:47.126 "data_size": 65536 00:19:47.126 }, 00:19:47.126 { 00:19:47.126 "name": "BaseBdev3", 00:19:47.126 "uuid": "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586", 00:19:47.126 "is_configured": true, 00:19:47.126 "data_offset": 0, 00:19:47.126 "data_size": 65536 00:19:47.126 }, 00:19:47.126 { 00:19:47.126 "name": "BaseBdev4", 00:19:47.126 "uuid": "8c39e635-89f7-4ace-8f00-272c1720c643", 00:19:47.126 "is_configured": true, 00:19:47.126 "data_offset": 0, 00:19:47.126 "data_size": 65536 00:19:47.126 } 00:19:47.126 ] 00:19:47.126 } 00:19:47.126 } 00:19:47.126 }' 00:19:47.126 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:47.126 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:47.126 BaseBdev2 00:19:47.126 BaseBdev3 00:19:47.126 BaseBdev4' 00:19:47.126 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:47.126 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:47.126 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:47.384 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:47.384 "name": "NewBaseBdev", 00:19:47.384 "aliases": [ 00:19:47.384 "26fc70f0-03a4-45f9-aaa8-565ad0f95e74" 00:19:47.384 ], 00:19:47.384 "product_name": "Malloc disk", 00:19:47.384 "block_size": 512, 00:19:47.384 "num_blocks": 65536, 00:19:47.384 "uuid": "26fc70f0-03a4-45f9-aaa8-565ad0f95e74", 00:19:47.384 "assigned_rate_limits": { 00:19:47.384 "rw_ios_per_sec": 0, 00:19:47.384 "rw_mbytes_per_sec": 0, 00:19:47.384 "r_mbytes_per_sec": 0, 00:19:47.384 "w_mbytes_per_sec": 0 00:19:47.384 }, 00:19:47.384 "claimed": true, 00:19:47.384 "claim_type": "exclusive_write", 00:19:47.384 "zoned": false, 00:19:47.384 "supported_io_types": { 00:19:47.384 "read": true, 00:19:47.384 "write": true, 00:19:47.384 "unmap": true, 00:19:47.384 "flush": true, 00:19:47.384 "reset": true, 00:19:47.384 "nvme_admin": false, 00:19:47.384 "nvme_io": false, 00:19:47.384 "nvme_io_md": false, 00:19:47.384 "write_zeroes": true, 00:19:47.384 "zcopy": true, 00:19:47.384 "get_zone_info": false, 00:19:47.384 "zone_management": false, 00:19:47.384 "zone_append": false, 00:19:47.384 "compare": false, 00:19:47.384 "compare_and_write": false, 00:19:47.384 "abort": true, 00:19:47.384 "seek_hole": false, 00:19:47.384 "seek_data": false, 00:19:47.384 "copy": true, 00:19:47.384 "nvme_iov_md": false 00:19:47.384 }, 00:19:47.384 "memory_domains": [ 00:19:47.384 { 00:19:47.384 "dma_device_id": "system", 00:19:47.384 "dma_device_type": 1 00:19:47.384 }, 00:19:47.384 { 00:19:47.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.384 "dma_device_type": 2 00:19:47.384 } 00:19:47.384 ], 00:19:47.384 "driver_specific": {} 00:19:47.384 }' 00:19:47.384 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.384 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.384 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:47.384 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.642 13:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.642 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:47.642 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.642 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.642 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:47.642 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.642 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.642 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:47.642 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:47.642 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:47.642 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:47.901 13:45:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:47.901 "name": "BaseBdev2", 00:19:47.901 "aliases": [ 00:19:47.901 "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb" 00:19:47.901 ], 00:19:47.901 "product_name": "Malloc disk", 00:19:47.901 "block_size": 512, 00:19:47.901 "num_blocks": 65536, 00:19:47.901 "uuid": "e75cc9a8-7b55-48d5-8a08-94e07a66ebdb", 00:19:47.901 "assigned_rate_limits": { 00:19:47.901 "rw_ios_per_sec": 0, 00:19:47.901 "rw_mbytes_per_sec": 0, 00:19:47.901 "r_mbytes_per_sec": 0, 00:19:47.901 "w_mbytes_per_sec": 0 00:19:47.901 }, 00:19:47.901 "claimed": true, 00:19:47.901 "claim_type": "exclusive_write", 00:19:47.901 "zoned": false, 00:19:47.901 "supported_io_types": { 00:19:47.901 "read": true, 00:19:47.901 "write": true, 00:19:47.901 "unmap": true, 00:19:47.901 "flush": true, 00:19:47.901 "reset": true, 00:19:47.901 "nvme_admin": false, 00:19:47.901 "nvme_io": false, 00:19:47.901 "nvme_io_md": false, 00:19:47.901 "write_zeroes": true, 00:19:47.901 "zcopy": true, 00:19:47.901 "get_zone_info": false, 00:19:47.901 "zone_management": false, 00:19:47.901 "zone_append": false, 00:19:47.901 "compare": false, 00:19:47.901 "compare_and_write": false, 00:19:47.901 "abort": true, 00:19:47.901 "seek_hole": false, 00:19:47.901 "seek_data": false, 00:19:47.901 "copy": true, 00:19:47.901 "nvme_iov_md": false 00:19:47.901 }, 00:19:47.901 "memory_domains": [ 00:19:47.901 { 00:19:47.901 "dma_device_id": "system", 00:19:47.901 "dma_device_type": 1 00:19:47.901 }, 00:19:47.901 { 00:19:47.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.901 "dma_device_type": 2 00:19:47.901 } 00:19:47.901 ], 00:19:47.901 "driver_specific": {} 00:19:47.901 }' 00:19:47.901 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.159 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.159 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:48.159 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:48.159 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:48.159 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:48.159 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:48.159 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:48.159 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:48.159 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.418 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.418 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:48.418 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:48.418 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:48.418 13:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:48.676 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:48.676 "name": "BaseBdev3", 00:19:48.676 "aliases": [ 00:19:48.676 
"d61b5dfe-4d2b-4e75-b6d6-fccd1570d586" 00:19:48.676 ], 00:19:48.676 "product_name": "Malloc disk", 00:19:48.676 "block_size": 512, 00:19:48.676 "num_blocks": 65536, 00:19:48.676 "uuid": "d61b5dfe-4d2b-4e75-b6d6-fccd1570d586", 00:19:48.676 "assigned_rate_limits": { 00:19:48.676 "rw_ios_per_sec": 0, 00:19:48.676 "rw_mbytes_per_sec": 0, 00:19:48.676 "r_mbytes_per_sec": 0, 00:19:48.676 "w_mbytes_per_sec": 0 00:19:48.676 }, 00:19:48.676 "claimed": true, 00:19:48.676 "claim_type": "exclusive_write", 00:19:48.676 "zoned": false, 00:19:48.676 "supported_io_types": { 00:19:48.676 "read": true, 00:19:48.676 "write": true, 00:19:48.676 "unmap": true, 00:19:48.676 "flush": true, 00:19:48.676 "reset": true, 00:19:48.676 "nvme_admin": false, 00:19:48.676 "nvme_io": false, 00:19:48.676 "nvme_io_md": false, 00:19:48.676 "write_zeroes": true, 00:19:48.676 "zcopy": true, 00:19:48.676 "get_zone_info": false, 00:19:48.676 "zone_management": false, 00:19:48.676 "zone_append": false, 00:19:48.676 "compare": false, 00:19:48.676 "compare_and_write": false, 00:19:48.676 "abort": true, 00:19:48.676 "seek_hole": false, 00:19:48.676 "seek_data": false, 00:19:48.676 "copy": true, 00:19:48.676 "nvme_iov_md": false 00:19:48.676 }, 00:19:48.676 "memory_domains": [ 00:19:48.676 { 00:19:48.676 "dma_device_id": "system", 00:19:48.676 "dma_device_type": 1 00:19:48.676 }, 00:19:48.676 { 00:19:48.676 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.676 "dma_device_type": 2 00:19:48.676 } 00:19:48.676 ], 00:19:48.676 "driver_specific": {} 00:19:48.676 }' 00:19:48.676 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.676 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.676 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:48.676 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:48.676 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:48.676 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:48.676 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:48.935 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:48.935 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:48.935 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.935 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.935 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:48.935 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:48.935 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:48.935 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:49.193 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:49.194 "name": "BaseBdev4", 00:19:49.194 "aliases": [ 00:19:49.194 "8c39e635-89f7-4ace-8f00-272c1720c643" 00:19:49.194 ], 00:19:49.194 "product_name": "Malloc disk", 00:19:49.194 "block_size": 512, 00:19:49.194 "num_blocks": 65536, 00:19:49.194 
"uuid": "8c39e635-89f7-4ace-8f00-272c1720c643", 00:19:49.194 "assigned_rate_limits": { 00:19:49.194 "rw_ios_per_sec": 0, 00:19:49.194 "rw_mbytes_per_sec": 0, 00:19:49.194 "r_mbytes_per_sec": 0, 00:19:49.194 "w_mbytes_per_sec": 0 00:19:49.194 }, 00:19:49.194 "claimed": true, 00:19:49.194 "claim_type": "exclusive_write", 00:19:49.194 "zoned": false, 00:19:49.194 "supported_io_types": { 00:19:49.194 "read": true, 00:19:49.194 "write": true, 00:19:49.194 "unmap": true, 00:19:49.194 "flush": true, 00:19:49.194 "reset": true, 00:19:49.194 "nvme_admin": false, 00:19:49.194 "nvme_io": false, 00:19:49.194 "nvme_io_md": false, 00:19:49.194 "write_zeroes": true, 00:19:49.194 "zcopy": true, 00:19:49.194 "get_zone_info": false, 00:19:49.194 "zone_management": false, 00:19:49.194 "zone_append": false, 00:19:49.194 "compare": false, 00:19:49.194 "compare_and_write": false, 00:19:49.194 "abort": true, 00:19:49.194 "seek_hole": false, 00:19:49.194 "seek_data": false, 00:19:49.194 "copy": true, 00:19:49.194 "nvme_iov_md": false 00:19:49.194 }, 00:19:49.194 "memory_domains": [ 00:19:49.194 { 00:19:49.194 "dma_device_id": "system", 00:19:49.194 "dma_device_type": 1 00:19:49.194 }, 00:19:49.194 { 00:19:49.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.194 "dma_device_type": 2 00:19:49.194 } 00:19:49.194 ], 00:19:49.194 "driver_specific": {} 00:19:49.194 }' 00:19:49.194 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.194 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.194 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:49.194 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.453 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.453 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:49.453 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.453 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.453 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:49.453 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:49.453 13:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:49.453 13:45:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:49.453 13:45:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:49.712 [2024-07-12 13:45:38.257942] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:49.712 [2024-07-12 13:45:38.257974] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:49.712 [2024-07-12 13:45:38.258032] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:49.712 [2024-07-12 13:45:38.258091] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:49.712 [2024-07-12 13:45:38.258103] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19efa20 name Existed_Raid, state offline 00:19:49.712 13:45:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@341 -- # killprocess 505650 00:19:49.712 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 505650 ']' 00:19:49.712 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 505650 00:19:49.712 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:19:49.712 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:49.712 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 505650 00:19:49.971 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:49.971 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:49.971 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 505650' 00:19:49.971 killing process with pid 505650 00:19:49.971 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 505650 00:19:49.971 [2024-07-12 13:45:38.328462] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:49.971 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 505650 00:19:49.971 [2024-07-12 13:45:38.364877] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:50.230 00:19:50.230 real 0m33.076s 00:19:50.230 user 1m0.830s 00:19:50.230 sys 0m5.879s 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:50.230 ************************************ 00:19:50.230 END TEST raid_state_function_test 00:19:50.230 ************************************ 00:19:50.230 13:45:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:50.230 13:45:38 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:19:50.230 13:45:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:50.230 13:45:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:50.230 13:45:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:50.230 ************************************ 00:19:50.230 START TEST raid_state_function_test_sb 00:19:50.230 ************************************ 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:50.230 13:45:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:50.230 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=510722 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 510722' 00:19:50.231 Process raid pid: 510722 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 510722 /var/tmp/spdk-raid.sock 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 510722 ']' 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process 
to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:50.231 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:50.231 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:50.231 [2024-07-12 13:45:38.725012] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:19:50.231 [2024-07-12 13:45:38.725075] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:50.490 [2024-07-12 13:45:38.845516] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:50.490 [2024-07-12 13:45:38.949222] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:50.490 [2024-07-12 13:45:39.010085] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:50.490 [2024-07-12 13:45:39.010120] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:51.426 [2024-07-12 13:45:39.918520] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:51.426 [2024-07-12 13:45:39.918563] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:51.426 [2024-07-12 13:45:39.918574] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:51.426 [2024-07-12 13:45:39.918586] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:51.426 [2024-07-12 13:45:39.918595] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:51.426 [2024-07-12 13:45:39.918606] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:51.426 [2024-07-12 13:45:39.918615] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:51.426 [2024-07-12 13:45:39.918626] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.426 13:45:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.426 13:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:51.685 13:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.685 "name": "Existed_Raid", 00:19:51.685 "uuid": "1e800fa2-eaf3-4926-8177-4208d8c6621a", 00:19:51.685 "strip_size_kb": 64, 00:19:51.685 "state": "configuring", 00:19:51.685 "raid_level": "concat", 00:19:51.685 "superblock": true, 00:19:51.685 "num_base_bdevs": 4, 00:19:51.685 "num_base_bdevs_discovered": 0, 00:19:51.685 "num_base_bdevs_operational": 4, 00:19:51.685 "base_bdevs_list": [ 00:19:51.685 { 00:19:51.685 "name": "BaseBdev1", 00:19:51.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.685 "is_configured": false, 00:19:51.685 "data_offset": 0, 00:19:51.685 "data_size": 0 00:19:51.685 }, 00:19:51.685 { 00:19:51.685 "name": "BaseBdev2", 00:19:51.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.685 "is_configured": false, 00:19:51.685 "data_offset": 0, 00:19:51.685 "data_size": 0 00:19:51.685 }, 00:19:51.685 { 00:19:51.685 "name": "BaseBdev3", 00:19:51.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.685 "is_configured": false, 00:19:51.685 "data_offset": 0, 00:19:51.685 "data_size": 0 00:19:51.685 }, 00:19:51.685 { 00:19:51.685 "name": "BaseBdev4", 00:19:51.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.685 "is_configured": false, 00:19:51.685 "data_offset": 0, 00:19:51.685 "data_size": 0 00:19:51.685 } 00:19:51.685 ] 00:19:51.685 }' 00:19:51.685 13:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.685 13:45:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:52.621 13:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:52.621 [2024-07-12 13:45:41.129563] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:52.621 [2024-07-12 13:45:41.129595] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa9a370 name Existed_Raid, state configuring 00:19:52.621 13:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:52.881 [2024-07-12 13:45:41.374234] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:52.881 [2024-07-12 13:45:41.374265] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:52.881 [2024-07-12 13:45:41.374275] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:19:52.881 [2024-07-12 13:45:41.374287] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:52.881 [2024-07-12 13:45:41.374295] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:52.881 [2024-07-12 13:45:41.374306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:52.881 [2024-07-12 13:45:41.374315] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:52.881 [2024-07-12 13:45:41.374326] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:52.881 13:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:53.140 [2024-07-12 13:45:41.629974] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:53.140 BaseBdev1 00:19:53.140 13:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:53.140 13:45:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:53.140 13:45:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:53.140 13:45:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:53.140 13:45:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:53.140 13:45:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:53.140 13:45:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:53.399 13:45:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:53.659 [ 00:19:53.659 { 00:19:53.659 "name": "BaseBdev1", 00:19:53.659 "aliases": [ 00:19:53.659 "bb9ec164-ced9-41a2-96ca-36abac16d431" 00:19:53.659 ], 00:19:53.659 "product_name": "Malloc disk", 00:19:53.659 "block_size": 512, 00:19:53.659 "num_blocks": 65536, 00:19:53.659 "uuid": "bb9ec164-ced9-41a2-96ca-36abac16d431", 00:19:53.659 "assigned_rate_limits": { 00:19:53.659 "rw_ios_per_sec": 0, 00:19:53.659 "rw_mbytes_per_sec": 0, 00:19:53.659 "r_mbytes_per_sec": 0, 00:19:53.659 "w_mbytes_per_sec": 0 00:19:53.659 }, 00:19:53.659 "claimed": true, 00:19:53.659 "claim_type": "exclusive_write", 00:19:53.659 "zoned": false, 00:19:53.659 "supported_io_types": { 00:19:53.659 "read": true, 00:19:53.659 "write": true, 00:19:53.659 "unmap": true, 00:19:53.659 "flush": true, 00:19:53.659 "reset": true, 00:19:53.659 "nvme_admin": false, 00:19:53.659 "nvme_io": false, 00:19:53.659 "nvme_io_md": false, 00:19:53.659 "write_zeroes": true, 00:19:53.659 "zcopy": true, 00:19:53.659 "get_zone_info": false, 00:19:53.659 "zone_management": false, 00:19:53.659 "zone_append": false, 00:19:53.659 "compare": false, 00:19:53.659 "compare_and_write": false, 00:19:53.659 "abort": true, 00:19:53.659 "seek_hole": false, 00:19:53.659 "seek_data": false, 00:19:53.659 "copy": true, 00:19:53.659 "nvme_iov_md": false 00:19:53.659 }, 00:19:53.659 "memory_domains": [ 00:19:53.659 { 00:19:53.659 "dma_device_id": "system", 
00:19:53.659 "dma_device_type": 1 00:19:53.659 }, 00:19:53.659 { 00:19:53.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.659 "dma_device_type": 2 00:19:53.659 } 00:19:53.659 ], 00:19:53.659 "driver_specific": {} 00:19:53.659 } 00:19:53.659 ] 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.659 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:53.918 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.918 "name": "Existed_Raid", 00:19:53.918 "uuid": "fc2a8954-937e-465c-9abb-4ce246caffc4", 00:19:53.918 "strip_size_kb": 64, 00:19:53.918 "state": "configuring", 00:19:53.918 "raid_level": "concat", 00:19:53.918 "superblock": true, 00:19:53.918 "num_base_bdevs": 4, 00:19:53.918 "num_base_bdevs_discovered": 1, 00:19:53.918 "num_base_bdevs_operational": 4, 00:19:53.918 "base_bdevs_list": [ 00:19:53.918 { 00:19:53.918 "name": "BaseBdev1", 00:19:53.918 "uuid": "bb9ec164-ced9-41a2-96ca-36abac16d431", 00:19:53.918 "is_configured": true, 00:19:53.918 "data_offset": 2048, 00:19:53.918 "data_size": 63488 00:19:53.918 }, 00:19:53.918 { 00:19:53.918 "name": "BaseBdev2", 00:19:53.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:53.918 "is_configured": false, 00:19:53.918 "data_offset": 0, 00:19:53.918 "data_size": 0 00:19:53.918 }, 00:19:53.918 { 00:19:53.918 "name": "BaseBdev3", 00:19:53.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:53.918 "is_configured": false, 00:19:53.918 "data_offset": 0, 00:19:53.918 "data_size": 0 00:19:53.918 }, 00:19:53.918 { 00:19:53.918 "name": "BaseBdev4", 00:19:53.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:53.918 "is_configured": false, 00:19:53.918 "data_offset": 0, 00:19:53.918 "data_size": 0 00:19:53.918 } 00:19:53.918 ] 00:19:53.918 }' 00:19:53.918 13:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.918 13:45:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:54.485 13:45:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:54.743 [2024-07-12 13:45:43.274313] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:54.743 [2024-07-12 13:45:43.274357] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa99be0 name Existed_Raid, state configuring 00:19:54.743 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:55.002 [2024-07-12 13:45:43.523016] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:55.002 [2024-07-12 13:45:43.524426] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:55.002 [2024-07-12 13:45:43.524457] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:55.002 [2024-07-12 13:45:43.524468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:55.002 [2024-07-12 13:45:43.524480] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:55.002 [2024-07-12 13:45:43.524489] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:55.002 [2024-07-12 13:45:43.524500] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.002 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:55.260 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:55.260 "name": "Existed_Raid", 00:19:55.260 
"uuid": "1061b832-5747-4241-91a8-1b65026df3a2", 00:19:55.260 "strip_size_kb": 64, 00:19:55.260 "state": "configuring", 00:19:55.260 "raid_level": "concat", 00:19:55.260 "superblock": true, 00:19:55.260 "num_base_bdevs": 4, 00:19:55.260 "num_base_bdevs_discovered": 1, 00:19:55.260 "num_base_bdevs_operational": 4, 00:19:55.260 "base_bdevs_list": [ 00:19:55.260 { 00:19:55.260 "name": "BaseBdev1", 00:19:55.260 "uuid": "bb9ec164-ced9-41a2-96ca-36abac16d431", 00:19:55.260 "is_configured": true, 00:19:55.260 "data_offset": 2048, 00:19:55.260 "data_size": 63488 00:19:55.260 }, 00:19:55.260 { 00:19:55.260 "name": "BaseBdev2", 00:19:55.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:55.260 "is_configured": false, 00:19:55.260 "data_offset": 0, 00:19:55.260 "data_size": 0 00:19:55.260 }, 00:19:55.260 { 00:19:55.260 "name": "BaseBdev3", 00:19:55.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:55.260 "is_configured": false, 00:19:55.260 "data_offset": 0, 00:19:55.260 "data_size": 0 00:19:55.260 }, 00:19:55.260 { 00:19:55.260 "name": "BaseBdev4", 00:19:55.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:55.260 "is_configured": false, 00:19:55.260 "data_offset": 0, 00:19:55.260 "data_size": 0 00:19:55.260 } 00:19:55.260 ] 00:19:55.260 }' 00:19:55.260 13:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:55.260 13:45:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:56.196 13:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:56.455 [2024-07-12 13:45:44.806964] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:56.455 BaseBdev2 00:19:56.455 13:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:56.455 13:45:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:56.455 13:45:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:56.455 13:45:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:56.455 13:45:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:56.456 13:45:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:56.456 13:45:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:56.715 [ 00:19:56.715 { 00:19:56.715 "name": "BaseBdev2", 00:19:56.715 "aliases": [ 00:19:56.715 "d98c5a46-f1b4-49bc-a36e-8aa64839388d" 00:19:56.715 ], 00:19:56.715 "product_name": "Malloc disk", 00:19:56.715 "block_size": 512, 00:19:56.715 "num_blocks": 65536, 00:19:56.715 "uuid": "d98c5a46-f1b4-49bc-a36e-8aa64839388d", 00:19:56.715 "assigned_rate_limits": { 00:19:56.715 "rw_ios_per_sec": 0, 00:19:56.715 "rw_mbytes_per_sec": 0, 00:19:56.715 "r_mbytes_per_sec": 0, 00:19:56.715 "w_mbytes_per_sec": 0 00:19:56.715 }, 00:19:56.715 "claimed": true, 00:19:56.715 "claim_type": 
"exclusive_write", 00:19:56.715 "zoned": false, 00:19:56.715 "supported_io_types": { 00:19:56.715 "read": true, 00:19:56.715 "write": true, 00:19:56.715 "unmap": true, 00:19:56.715 "flush": true, 00:19:56.715 "reset": true, 00:19:56.715 "nvme_admin": false, 00:19:56.715 "nvme_io": false, 00:19:56.715 "nvme_io_md": false, 00:19:56.715 "write_zeroes": true, 00:19:56.715 "zcopy": true, 00:19:56.715 "get_zone_info": false, 00:19:56.715 "zone_management": false, 00:19:56.715 "zone_append": false, 00:19:56.715 "compare": false, 00:19:56.715 "compare_and_write": false, 00:19:56.715 "abort": true, 00:19:56.715 "seek_hole": false, 00:19:56.715 "seek_data": false, 00:19:56.715 "copy": true, 00:19:56.715 "nvme_iov_md": false 00:19:56.715 }, 00:19:56.715 "memory_domains": [ 00:19:56.715 { 00:19:56.715 "dma_device_id": "system", 00:19:56.715 "dma_device_type": 1 00:19:56.715 }, 00:19:56.715 { 00:19:56.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.715 "dma_device_type": 2 00:19:56.715 } 00:19:56.715 ], 00:19:56.715 "driver_specific": {} 00:19:56.715 } 00:19:56.715 ] 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.715 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.974 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.974 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:56.974 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:56.974 "name": "Existed_Raid", 00:19:56.974 "uuid": "1061b832-5747-4241-91a8-1b65026df3a2", 00:19:56.974 "strip_size_kb": 64, 00:19:56.974 "state": "configuring", 00:19:56.974 "raid_level": "concat", 00:19:56.974 "superblock": true, 00:19:56.974 "num_base_bdevs": 4, 00:19:56.974 "num_base_bdevs_discovered": 2, 00:19:56.974 "num_base_bdevs_operational": 4, 00:19:56.974 "base_bdevs_list": [ 00:19:56.974 { 00:19:56.974 "name": "BaseBdev1", 00:19:56.974 "uuid": "bb9ec164-ced9-41a2-96ca-36abac16d431", 
00:19:56.974 "is_configured": true, 00:19:56.974 "data_offset": 2048, 00:19:56.974 "data_size": 63488 00:19:56.974 }, 00:19:56.974 { 00:19:56.974 "name": "BaseBdev2", 00:19:56.974 "uuid": "d98c5a46-f1b4-49bc-a36e-8aa64839388d", 00:19:56.974 "is_configured": true, 00:19:56.974 "data_offset": 2048, 00:19:56.974 "data_size": 63488 00:19:56.974 }, 00:19:56.974 { 00:19:56.974 "name": "BaseBdev3", 00:19:56.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.974 "is_configured": false, 00:19:56.974 "data_offset": 0, 00:19:56.974 "data_size": 0 00:19:56.974 }, 00:19:56.974 { 00:19:56.974 "name": "BaseBdev4", 00:19:56.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.974 "is_configured": false, 00:19:56.974 "data_offset": 0, 00:19:56.974 "data_size": 0 00:19:56.974 } 00:19:56.974 ] 00:19:56.974 }' 00:19:56.974 13:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:56.974 13:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:57.542 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:57.802 [2024-07-12 13:45:46.346566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:57.802 BaseBdev3 00:19:57.802 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:57.802 13:45:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:57.802 13:45:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:57.802 13:45:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:57.802 13:45:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:57.802 13:45:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:57.802 13:45:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:58.061 13:45:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:58.319 [ 00:19:58.319 { 00:19:58.319 "name": "BaseBdev3", 00:19:58.319 "aliases": [ 00:19:58.319 "1446f0ae-8284-4291-a7bd-e723da2bcdf5" 00:19:58.319 ], 00:19:58.319 "product_name": "Malloc disk", 00:19:58.319 "block_size": 512, 00:19:58.319 "num_blocks": 65536, 00:19:58.320 "uuid": "1446f0ae-8284-4291-a7bd-e723da2bcdf5", 00:19:58.320 "assigned_rate_limits": { 00:19:58.320 "rw_ios_per_sec": 0, 00:19:58.320 "rw_mbytes_per_sec": 0, 00:19:58.320 "r_mbytes_per_sec": 0, 00:19:58.320 "w_mbytes_per_sec": 0 00:19:58.320 }, 00:19:58.320 "claimed": true, 00:19:58.320 "claim_type": "exclusive_write", 00:19:58.320 "zoned": false, 00:19:58.320 "supported_io_types": { 00:19:58.320 "read": true, 00:19:58.320 "write": true, 00:19:58.320 "unmap": true, 00:19:58.320 "flush": true, 00:19:58.320 "reset": true, 00:19:58.320 "nvme_admin": false, 00:19:58.320 "nvme_io": false, 00:19:58.320 "nvme_io_md": false, 00:19:58.320 "write_zeroes": true, 00:19:58.320 "zcopy": true, 00:19:58.320 "get_zone_info": false, 00:19:58.320 "zone_management": 
false, 00:19:58.320 "zone_append": false, 00:19:58.320 "compare": false, 00:19:58.320 "compare_and_write": false, 00:19:58.320 "abort": true, 00:19:58.320 "seek_hole": false, 00:19:58.320 "seek_data": false, 00:19:58.320 "copy": true, 00:19:58.320 "nvme_iov_md": false 00:19:58.320 }, 00:19:58.320 "memory_domains": [ 00:19:58.320 { 00:19:58.320 "dma_device_id": "system", 00:19:58.320 "dma_device_type": 1 00:19:58.320 }, 00:19:58.320 { 00:19:58.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:58.320 "dma_device_type": 2 00:19:58.320 } 00:19:58.320 ], 00:19:58.320 "driver_specific": {} 00:19:58.320 } 00:19:58.320 ] 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.320 13:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:58.579 13:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.579 "name": "Existed_Raid", 00:19:58.579 "uuid": "1061b832-5747-4241-91a8-1b65026df3a2", 00:19:58.579 "strip_size_kb": 64, 00:19:58.579 "state": "configuring", 00:19:58.579 "raid_level": "concat", 00:19:58.579 "superblock": true, 00:19:58.579 "num_base_bdevs": 4, 00:19:58.579 "num_base_bdevs_discovered": 3, 00:19:58.579 "num_base_bdevs_operational": 4, 00:19:58.579 "base_bdevs_list": [ 00:19:58.579 { 00:19:58.579 "name": "BaseBdev1", 00:19:58.579 "uuid": "bb9ec164-ced9-41a2-96ca-36abac16d431", 00:19:58.579 "is_configured": true, 00:19:58.579 "data_offset": 2048, 00:19:58.579 "data_size": 63488 00:19:58.579 }, 00:19:58.579 { 00:19:58.579 "name": "BaseBdev2", 00:19:58.579 "uuid": "d98c5a46-f1b4-49bc-a36e-8aa64839388d", 00:19:58.579 "is_configured": true, 00:19:58.579 "data_offset": 2048, 00:19:58.579 "data_size": 63488 00:19:58.579 }, 00:19:58.579 { 00:19:58.579 "name": "BaseBdev3", 00:19:58.579 "uuid": "1446f0ae-8284-4291-a7bd-e723da2bcdf5", 
00:19:58.579 "is_configured": true, 00:19:58.579 "data_offset": 2048, 00:19:58.579 "data_size": 63488 00:19:58.579 }, 00:19:58.579 { 00:19:58.579 "name": "BaseBdev4", 00:19:58.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:58.579 "is_configured": false, 00:19:58.579 "data_offset": 0, 00:19:58.579 "data_size": 0 00:19:58.579 } 00:19:58.579 ] 00:19:58.579 }' 00:19:58.579 13:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.579 13:45:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:59.516 13:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:59.516 [2024-07-12 13:45:47.962232] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:59.516 [2024-07-12 13:45:47.962396] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa9ac40 00:19:59.516 [2024-07-12 13:45:47.962410] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:59.516 [2024-07-12 13:45:47.962579] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa9b8c0 00:19:59.516 [2024-07-12 13:45:47.962698] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa9ac40 00:19:59.516 [2024-07-12 13:45:47.962708] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa9ac40 00:19:59.516 [2024-07-12 13:45:47.962798] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:59.516 BaseBdev4 00:19:59.516 13:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:59.516 13:45:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:59.516 13:45:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:59.516 13:45:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:59.516 13:45:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:59.516 13:45:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:59.516 13:45:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:59.776 13:45:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:00.035 [ 00:20:00.035 { 00:20:00.035 "name": "BaseBdev4", 00:20:00.035 "aliases": [ 00:20:00.035 "58c426f5-3acc-49da-ad73-7f6cf6d49b37" 00:20:00.035 ], 00:20:00.035 "product_name": "Malloc disk", 00:20:00.035 "block_size": 512, 00:20:00.035 "num_blocks": 65536, 00:20:00.035 "uuid": "58c426f5-3acc-49da-ad73-7f6cf6d49b37", 00:20:00.035 "assigned_rate_limits": { 00:20:00.035 "rw_ios_per_sec": 0, 00:20:00.035 "rw_mbytes_per_sec": 0, 00:20:00.035 "r_mbytes_per_sec": 0, 00:20:00.035 "w_mbytes_per_sec": 0 00:20:00.035 }, 00:20:00.035 "claimed": true, 00:20:00.035 "claim_type": "exclusive_write", 00:20:00.035 "zoned": false, 00:20:00.035 "supported_io_types": { 00:20:00.035 "read": true, 00:20:00.035 "write": true, 
00:20:00.035 "unmap": true, 00:20:00.035 "flush": true, 00:20:00.035 "reset": true, 00:20:00.035 "nvme_admin": false, 00:20:00.035 "nvme_io": false, 00:20:00.035 "nvme_io_md": false, 00:20:00.035 "write_zeroes": true, 00:20:00.035 "zcopy": true, 00:20:00.035 "get_zone_info": false, 00:20:00.035 "zone_management": false, 00:20:00.035 "zone_append": false, 00:20:00.035 "compare": false, 00:20:00.035 "compare_and_write": false, 00:20:00.035 "abort": true, 00:20:00.035 "seek_hole": false, 00:20:00.035 "seek_data": false, 00:20:00.035 "copy": true, 00:20:00.035 "nvme_iov_md": false 00:20:00.035 }, 00:20:00.035 "memory_domains": [ 00:20:00.035 { 00:20:00.035 "dma_device_id": "system", 00:20:00.035 "dma_device_type": 1 00:20:00.035 }, 00:20:00.035 { 00:20:00.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:00.035 "dma_device_type": 2 00:20:00.035 } 00:20:00.035 ], 00:20:00.035 "driver_specific": {} 00:20:00.035 } 00:20:00.035 ] 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.035 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:00.294 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:00.294 "name": "Existed_Raid", 00:20:00.294 "uuid": "1061b832-5747-4241-91a8-1b65026df3a2", 00:20:00.294 "strip_size_kb": 64, 00:20:00.294 "state": "online", 00:20:00.294 "raid_level": "concat", 00:20:00.294 "superblock": true, 00:20:00.294 "num_base_bdevs": 4, 00:20:00.294 "num_base_bdevs_discovered": 4, 00:20:00.294 "num_base_bdevs_operational": 4, 00:20:00.294 "base_bdevs_list": [ 00:20:00.294 { 00:20:00.294 "name": "BaseBdev1", 00:20:00.294 "uuid": "bb9ec164-ced9-41a2-96ca-36abac16d431", 00:20:00.294 "is_configured": true, 00:20:00.294 "data_offset": 2048, 00:20:00.294 "data_size": 63488 00:20:00.294 }, 00:20:00.294 { 00:20:00.294 "name": 
"BaseBdev2", 00:20:00.294 "uuid": "d98c5a46-f1b4-49bc-a36e-8aa64839388d", 00:20:00.294 "is_configured": true, 00:20:00.294 "data_offset": 2048, 00:20:00.294 "data_size": 63488 00:20:00.294 }, 00:20:00.294 { 00:20:00.294 "name": "BaseBdev3", 00:20:00.294 "uuid": "1446f0ae-8284-4291-a7bd-e723da2bcdf5", 00:20:00.294 "is_configured": true, 00:20:00.294 "data_offset": 2048, 00:20:00.294 "data_size": 63488 00:20:00.294 }, 00:20:00.294 { 00:20:00.294 "name": "BaseBdev4", 00:20:00.294 "uuid": "58c426f5-3acc-49da-ad73-7f6cf6d49b37", 00:20:00.294 "is_configured": true, 00:20:00.294 "data_offset": 2048, 00:20:00.294 "data_size": 63488 00:20:00.294 } 00:20:00.294 ] 00:20:00.294 }' 00:20:00.294 13:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:00.294 13:45:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:00.861 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:00.861 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:00.861 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:00.861 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:00.861 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:00.861 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:00.861 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:00.861 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:01.120 [2024-07-12 13:45:49.546766] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:01.120 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:01.120 "name": "Existed_Raid", 00:20:01.120 "aliases": [ 00:20:01.120 "1061b832-5747-4241-91a8-1b65026df3a2" 00:20:01.120 ], 00:20:01.120 "product_name": "Raid Volume", 00:20:01.120 "block_size": 512, 00:20:01.120 "num_blocks": 253952, 00:20:01.120 "uuid": "1061b832-5747-4241-91a8-1b65026df3a2", 00:20:01.120 "assigned_rate_limits": { 00:20:01.120 "rw_ios_per_sec": 0, 00:20:01.120 "rw_mbytes_per_sec": 0, 00:20:01.120 "r_mbytes_per_sec": 0, 00:20:01.120 "w_mbytes_per_sec": 0 00:20:01.120 }, 00:20:01.120 "claimed": false, 00:20:01.120 "zoned": false, 00:20:01.120 "supported_io_types": { 00:20:01.120 "read": true, 00:20:01.120 "write": true, 00:20:01.120 "unmap": true, 00:20:01.120 "flush": true, 00:20:01.120 "reset": true, 00:20:01.120 "nvme_admin": false, 00:20:01.120 "nvme_io": false, 00:20:01.120 "nvme_io_md": false, 00:20:01.120 "write_zeroes": true, 00:20:01.120 "zcopy": false, 00:20:01.120 "get_zone_info": false, 00:20:01.120 "zone_management": false, 00:20:01.120 "zone_append": false, 00:20:01.120 "compare": false, 00:20:01.120 "compare_and_write": false, 00:20:01.120 "abort": false, 00:20:01.120 "seek_hole": false, 00:20:01.120 "seek_data": false, 00:20:01.120 "copy": false, 00:20:01.120 "nvme_iov_md": false 00:20:01.120 }, 00:20:01.120 "memory_domains": [ 00:20:01.120 { 00:20:01.120 "dma_device_id": "system", 00:20:01.120 "dma_device_type": 1 00:20:01.120 }, 00:20:01.120 { 00:20:01.120 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:20:01.120 "dma_device_type": 2 00:20:01.120 }, 00:20:01.120 { 00:20:01.120 "dma_device_id": "system", 00:20:01.120 "dma_device_type": 1 00:20:01.120 }, 00:20:01.120 { 00:20:01.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.120 "dma_device_type": 2 00:20:01.120 }, 00:20:01.120 { 00:20:01.120 "dma_device_id": "system", 00:20:01.120 "dma_device_type": 1 00:20:01.120 }, 00:20:01.120 { 00:20:01.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.120 "dma_device_type": 2 00:20:01.120 }, 00:20:01.120 { 00:20:01.120 "dma_device_id": "system", 00:20:01.120 "dma_device_type": 1 00:20:01.120 }, 00:20:01.120 { 00:20:01.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.120 "dma_device_type": 2 00:20:01.120 } 00:20:01.120 ], 00:20:01.120 "driver_specific": { 00:20:01.120 "raid": { 00:20:01.120 "uuid": "1061b832-5747-4241-91a8-1b65026df3a2", 00:20:01.120 "strip_size_kb": 64, 00:20:01.120 "state": "online", 00:20:01.120 "raid_level": "concat", 00:20:01.120 "superblock": true, 00:20:01.120 "num_base_bdevs": 4, 00:20:01.120 "num_base_bdevs_discovered": 4, 00:20:01.120 "num_base_bdevs_operational": 4, 00:20:01.120 "base_bdevs_list": [ 00:20:01.120 { 00:20:01.120 "name": "BaseBdev1", 00:20:01.120 "uuid": "bb9ec164-ced9-41a2-96ca-36abac16d431", 00:20:01.120 "is_configured": true, 00:20:01.120 "data_offset": 2048, 00:20:01.120 "data_size": 63488 00:20:01.120 }, 00:20:01.120 { 00:20:01.120 "name": "BaseBdev2", 00:20:01.120 "uuid": "d98c5a46-f1b4-49bc-a36e-8aa64839388d", 00:20:01.120 "is_configured": true, 00:20:01.120 "data_offset": 2048, 00:20:01.120 "data_size": 63488 00:20:01.120 }, 00:20:01.120 { 00:20:01.120 "name": "BaseBdev3", 00:20:01.120 "uuid": "1446f0ae-8284-4291-a7bd-e723da2bcdf5", 00:20:01.120 "is_configured": true, 00:20:01.120 "data_offset": 2048, 00:20:01.120 "data_size": 63488 00:20:01.120 }, 00:20:01.120 { 00:20:01.120 "name": "BaseBdev4", 00:20:01.120 "uuid": "58c426f5-3acc-49da-ad73-7f6cf6d49b37", 00:20:01.120 "is_configured": true, 00:20:01.120 "data_offset": 2048, 00:20:01.120 "data_size": 63488 00:20:01.120 } 00:20:01.120 ] 00:20:01.120 } 00:20:01.120 } 00:20:01.120 }' 00:20:01.120 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:01.120 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:01.120 BaseBdev2 00:20:01.120 BaseBdev3 00:20:01.120 BaseBdev4' 00:20:01.120 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:01.120 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:01.120 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:01.379 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:01.379 "name": "BaseBdev1", 00:20:01.379 "aliases": [ 00:20:01.379 "bb9ec164-ced9-41a2-96ca-36abac16d431" 00:20:01.379 ], 00:20:01.379 "product_name": "Malloc disk", 00:20:01.379 "block_size": 512, 00:20:01.379 "num_blocks": 65536, 00:20:01.379 "uuid": "bb9ec164-ced9-41a2-96ca-36abac16d431", 00:20:01.379 "assigned_rate_limits": { 00:20:01.379 "rw_ios_per_sec": 0, 00:20:01.379 "rw_mbytes_per_sec": 0, 00:20:01.379 "r_mbytes_per_sec": 0, 00:20:01.379 "w_mbytes_per_sec": 0 00:20:01.379 }, 
00:20:01.379 "claimed": true, 00:20:01.379 "claim_type": "exclusive_write", 00:20:01.379 "zoned": false, 00:20:01.379 "supported_io_types": { 00:20:01.379 "read": true, 00:20:01.379 "write": true, 00:20:01.379 "unmap": true, 00:20:01.379 "flush": true, 00:20:01.379 "reset": true, 00:20:01.379 "nvme_admin": false, 00:20:01.379 "nvme_io": false, 00:20:01.379 "nvme_io_md": false, 00:20:01.379 "write_zeroes": true, 00:20:01.379 "zcopy": true, 00:20:01.379 "get_zone_info": false, 00:20:01.380 "zone_management": false, 00:20:01.380 "zone_append": false, 00:20:01.380 "compare": false, 00:20:01.380 "compare_and_write": false, 00:20:01.380 "abort": true, 00:20:01.380 "seek_hole": false, 00:20:01.380 "seek_data": false, 00:20:01.380 "copy": true, 00:20:01.380 "nvme_iov_md": false 00:20:01.380 }, 00:20:01.380 "memory_domains": [ 00:20:01.380 { 00:20:01.380 "dma_device_id": "system", 00:20:01.380 "dma_device_type": 1 00:20:01.380 }, 00:20:01.380 { 00:20:01.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.380 "dma_device_type": 2 00:20:01.380 } 00:20:01.380 ], 00:20:01.380 "driver_specific": {} 00:20:01.380 }' 00:20:01.380 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:01.380 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:01.380 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:01.637 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:01.637 13:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:01.637 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:01.637 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:01.637 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:01.637 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:01.637 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:01.638 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:01.638 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:01.638 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:01.638 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:01.638 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:01.896 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:01.896 "name": "BaseBdev2", 00:20:01.896 "aliases": [ 00:20:01.896 "d98c5a46-f1b4-49bc-a36e-8aa64839388d" 00:20:01.896 ], 00:20:01.896 "product_name": "Malloc disk", 00:20:01.896 "block_size": 512, 00:20:01.896 "num_blocks": 65536, 00:20:01.896 "uuid": "d98c5a46-f1b4-49bc-a36e-8aa64839388d", 00:20:01.896 "assigned_rate_limits": { 00:20:01.896 "rw_ios_per_sec": 0, 00:20:01.896 "rw_mbytes_per_sec": 0, 00:20:01.896 "r_mbytes_per_sec": 0, 00:20:01.896 "w_mbytes_per_sec": 0 00:20:01.896 }, 00:20:01.896 "claimed": true, 00:20:01.896 "claim_type": "exclusive_write", 00:20:01.896 "zoned": false, 00:20:01.896 
"supported_io_types": { 00:20:01.896 "read": true, 00:20:01.896 "write": true, 00:20:01.896 "unmap": true, 00:20:01.896 "flush": true, 00:20:01.896 "reset": true, 00:20:01.896 "nvme_admin": false, 00:20:01.896 "nvme_io": false, 00:20:01.896 "nvme_io_md": false, 00:20:01.896 "write_zeroes": true, 00:20:01.896 "zcopy": true, 00:20:01.896 "get_zone_info": false, 00:20:01.896 "zone_management": false, 00:20:01.896 "zone_append": false, 00:20:01.896 "compare": false, 00:20:01.896 "compare_and_write": false, 00:20:01.896 "abort": true, 00:20:01.896 "seek_hole": false, 00:20:01.896 "seek_data": false, 00:20:01.896 "copy": true, 00:20:01.896 "nvme_iov_md": false 00:20:01.896 }, 00:20:01.896 "memory_domains": [ 00:20:01.896 { 00:20:01.896 "dma_device_id": "system", 00:20:01.896 "dma_device_type": 1 00:20:01.896 }, 00:20:01.896 { 00:20:01.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.896 "dma_device_type": 2 00:20:01.896 } 00:20:01.896 ], 00:20:01.896 "driver_specific": {} 00:20:01.896 }' 00:20:01.896 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:02.154 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:02.154 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:02.154 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:02.154 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:02.154 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:02.154 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:02.154 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:02.154 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:02.413 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:02.413 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:02.413 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:02.413 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:02.413 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:02.413 13:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:02.672 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:02.672 "name": "BaseBdev3", 00:20:02.672 "aliases": [ 00:20:02.672 "1446f0ae-8284-4291-a7bd-e723da2bcdf5" 00:20:02.672 ], 00:20:02.672 "product_name": "Malloc disk", 00:20:02.672 "block_size": 512, 00:20:02.672 "num_blocks": 65536, 00:20:02.672 "uuid": "1446f0ae-8284-4291-a7bd-e723da2bcdf5", 00:20:02.672 "assigned_rate_limits": { 00:20:02.672 "rw_ios_per_sec": 0, 00:20:02.672 "rw_mbytes_per_sec": 0, 00:20:02.672 "r_mbytes_per_sec": 0, 00:20:02.672 "w_mbytes_per_sec": 0 00:20:02.672 }, 00:20:02.672 "claimed": true, 00:20:02.672 "claim_type": "exclusive_write", 00:20:02.672 "zoned": false, 00:20:02.672 "supported_io_types": { 00:20:02.672 "read": true, 00:20:02.672 "write": true, 00:20:02.672 "unmap": true, 00:20:02.672 "flush": 
true, 00:20:02.672 "reset": true, 00:20:02.672 "nvme_admin": false, 00:20:02.672 "nvme_io": false, 00:20:02.672 "nvme_io_md": false, 00:20:02.672 "write_zeroes": true, 00:20:02.672 "zcopy": true, 00:20:02.672 "get_zone_info": false, 00:20:02.672 "zone_management": false, 00:20:02.672 "zone_append": false, 00:20:02.672 "compare": false, 00:20:02.672 "compare_and_write": false, 00:20:02.672 "abort": true, 00:20:02.672 "seek_hole": false, 00:20:02.672 "seek_data": false, 00:20:02.672 "copy": true, 00:20:02.672 "nvme_iov_md": false 00:20:02.672 }, 00:20:02.672 "memory_domains": [ 00:20:02.672 { 00:20:02.672 "dma_device_id": "system", 00:20:02.672 "dma_device_type": 1 00:20:02.672 }, 00:20:02.672 { 00:20:02.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:02.672 "dma_device_type": 2 00:20:02.672 } 00:20:02.672 ], 00:20:02.672 "driver_specific": {} 00:20:02.672 }' 00:20:02.672 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:02.672 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:02.672 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:02.672 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:02.672 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:02.931 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:02.931 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:02.931 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:02.931 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:02.931 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:02.931 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:02.931 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:02.931 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:02.931 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:02.931 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:03.189 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:03.189 "name": "BaseBdev4", 00:20:03.189 "aliases": [ 00:20:03.189 "58c426f5-3acc-49da-ad73-7f6cf6d49b37" 00:20:03.189 ], 00:20:03.189 "product_name": "Malloc disk", 00:20:03.189 "block_size": 512, 00:20:03.189 "num_blocks": 65536, 00:20:03.189 "uuid": "58c426f5-3acc-49da-ad73-7f6cf6d49b37", 00:20:03.189 "assigned_rate_limits": { 00:20:03.189 "rw_ios_per_sec": 0, 00:20:03.189 "rw_mbytes_per_sec": 0, 00:20:03.189 "r_mbytes_per_sec": 0, 00:20:03.189 "w_mbytes_per_sec": 0 00:20:03.189 }, 00:20:03.189 "claimed": true, 00:20:03.189 "claim_type": "exclusive_write", 00:20:03.189 "zoned": false, 00:20:03.189 "supported_io_types": { 00:20:03.189 "read": true, 00:20:03.189 "write": true, 00:20:03.189 "unmap": true, 00:20:03.189 "flush": true, 00:20:03.189 "reset": true, 00:20:03.189 "nvme_admin": false, 00:20:03.189 "nvme_io": false, 00:20:03.189 "nvme_io_md": false, 
00:20:03.189 "write_zeroes": true, 00:20:03.189 "zcopy": true, 00:20:03.189 "get_zone_info": false, 00:20:03.189 "zone_management": false, 00:20:03.189 "zone_append": false, 00:20:03.189 "compare": false, 00:20:03.189 "compare_and_write": false, 00:20:03.189 "abort": true, 00:20:03.189 "seek_hole": false, 00:20:03.189 "seek_data": false, 00:20:03.189 "copy": true, 00:20:03.189 "nvme_iov_md": false 00:20:03.189 }, 00:20:03.189 "memory_domains": [ 00:20:03.189 { 00:20:03.189 "dma_device_id": "system", 00:20:03.189 "dma_device_type": 1 00:20:03.189 }, 00:20:03.189 { 00:20:03.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.189 "dma_device_type": 2 00:20:03.189 } 00:20:03.189 ], 00:20:03.189 "driver_specific": {} 00:20:03.189 }' 00:20:03.189 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:03.189 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:03.189 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:03.189 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:03.190 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:03.448 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:03.449 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:03.449 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:03.449 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:03.449 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:03.449 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:03.449 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:03.449 13:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:03.708 [2024-07-12 13:45:52.197576] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:03.708 [2024-07-12 13:45:52.197604] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:03.708 [2024-07-12 13:45:52.197654] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:03.708 13:45:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.708 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:03.966 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.966 "name": "Existed_Raid", 00:20:03.966 "uuid": "1061b832-5747-4241-91a8-1b65026df3a2", 00:20:03.966 "strip_size_kb": 64, 00:20:03.966 "state": "offline", 00:20:03.966 "raid_level": "concat", 00:20:03.966 "superblock": true, 00:20:03.966 "num_base_bdevs": 4, 00:20:03.966 "num_base_bdevs_discovered": 3, 00:20:03.966 "num_base_bdevs_operational": 3, 00:20:03.966 "base_bdevs_list": [ 00:20:03.966 { 00:20:03.966 "name": null, 00:20:03.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.966 "is_configured": false, 00:20:03.966 "data_offset": 2048, 00:20:03.966 "data_size": 63488 00:20:03.966 }, 00:20:03.966 { 00:20:03.966 "name": "BaseBdev2", 00:20:03.966 "uuid": "d98c5a46-f1b4-49bc-a36e-8aa64839388d", 00:20:03.966 "is_configured": true, 00:20:03.966 "data_offset": 2048, 00:20:03.966 "data_size": 63488 00:20:03.966 }, 00:20:03.966 { 00:20:03.966 "name": "BaseBdev3", 00:20:03.966 "uuid": "1446f0ae-8284-4291-a7bd-e723da2bcdf5", 00:20:03.966 "is_configured": true, 00:20:03.967 "data_offset": 2048, 00:20:03.967 "data_size": 63488 00:20:03.967 }, 00:20:03.967 { 00:20:03.967 "name": "BaseBdev4", 00:20:03.967 "uuid": "58c426f5-3acc-49da-ad73-7f6cf6d49b37", 00:20:03.967 "is_configured": true, 00:20:03.967 "data_offset": 2048, 00:20:03.967 "data_size": 63488 00:20:03.967 } 00:20:03.967 ] 00:20:03.967 }' 00:20:03.967 13:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:03.967 13:45:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:04.531 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:04.531 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:04.531 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.531 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:04.788 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:04.788 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:04.788 13:45:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:05.047 [2024-07-12 13:45:53.555081] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:05.047 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:05.047 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:05.047 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.047 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:05.305 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:05.305 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:05.305 13:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:05.564 [2024-07-12 13:45:54.064722] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:05.564 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:05.564 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:05.564 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:05.564 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.822 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:05.822 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:05.822 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:06.080 [2024-07-12 13:45:54.580606] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:06.080 [2024-07-12 13:45:54.580647] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa9ac40 name Existed_Raid, state offline 00:20:06.080 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:06.080 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:06.080 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.080 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:06.338 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:06.338 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:06.338 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:06.338 13:45:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:06.338 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:06.338 13:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:06.904 BaseBdev2 00:20:06.904 13:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:06.904 13:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:06.904 13:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:06.904 13:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:06.904 13:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:06.904 13:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:06.904 13:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:07.162 13:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:07.727 [ 00:20:07.727 { 00:20:07.727 "name": "BaseBdev2", 00:20:07.727 "aliases": [ 00:20:07.727 "a2f39964-ac0d-48cc-8e5c-c210c72b6c53" 00:20:07.727 ], 00:20:07.727 "product_name": "Malloc disk", 00:20:07.727 "block_size": 512, 00:20:07.727 "num_blocks": 65536, 00:20:07.727 "uuid": "a2f39964-ac0d-48cc-8e5c-c210c72b6c53", 00:20:07.727 "assigned_rate_limits": { 00:20:07.727 "rw_ios_per_sec": 0, 00:20:07.727 "rw_mbytes_per_sec": 0, 00:20:07.727 "r_mbytes_per_sec": 0, 00:20:07.727 "w_mbytes_per_sec": 0 00:20:07.727 }, 00:20:07.727 "claimed": false, 00:20:07.727 "zoned": false, 00:20:07.727 "supported_io_types": { 00:20:07.727 "read": true, 00:20:07.727 "write": true, 00:20:07.727 "unmap": true, 00:20:07.727 "flush": true, 00:20:07.727 "reset": true, 00:20:07.727 "nvme_admin": false, 00:20:07.727 "nvme_io": false, 00:20:07.727 "nvme_io_md": false, 00:20:07.727 "write_zeroes": true, 00:20:07.727 "zcopy": true, 00:20:07.727 "get_zone_info": false, 00:20:07.727 "zone_management": false, 00:20:07.727 "zone_append": false, 00:20:07.727 "compare": false, 00:20:07.727 "compare_and_write": false, 00:20:07.727 "abort": true, 00:20:07.727 "seek_hole": false, 00:20:07.727 "seek_data": false, 00:20:07.727 "copy": true, 00:20:07.727 "nvme_iov_md": false 00:20:07.727 }, 00:20:07.727 "memory_domains": [ 00:20:07.727 { 00:20:07.727 "dma_device_id": "system", 00:20:07.727 "dma_device_type": 1 00:20:07.727 }, 00:20:07.727 { 00:20:07.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:07.727 "dma_device_type": 2 00:20:07.727 } 00:20:07.727 ], 00:20:07.727 "driver_specific": {} 00:20:07.727 } 00:20:07.727 ] 00:20:07.727 13:45:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:07.727 13:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:07.727 13:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:07.727 13:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:07.985 BaseBdev3 00:20:07.985 13:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:07.985 13:45:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:07.985 13:45:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:07.985 13:45:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:07.985 13:45:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:07.985 13:45:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:07.985 13:45:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:08.552 13:45:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:08.552 [ 00:20:08.552 { 00:20:08.552 "name": "BaseBdev3", 00:20:08.552 "aliases": [ 00:20:08.552 "959506ab-2de0-48ee-bcc7-287e5de2e76a" 00:20:08.552 ], 00:20:08.552 "product_name": "Malloc disk", 00:20:08.552 "block_size": 512, 00:20:08.552 "num_blocks": 65536, 00:20:08.552 "uuid": "959506ab-2de0-48ee-bcc7-287e5de2e76a", 00:20:08.552 "assigned_rate_limits": { 00:20:08.552 "rw_ios_per_sec": 0, 00:20:08.552 "rw_mbytes_per_sec": 0, 00:20:08.552 "r_mbytes_per_sec": 0, 00:20:08.552 "w_mbytes_per_sec": 0 00:20:08.552 }, 00:20:08.552 "claimed": false, 00:20:08.552 "zoned": false, 00:20:08.552 "supported_io_types": { 00:20:08.552 "read": true, 00:20:08.552 "write": true, 00:20:08.552 "unmap": true, 00:20:08.552 "flush": true, 00:20:08.552 "reset": true, 00:20:08.552 "nvme_admin": false, 00:20:08.552 "nvme_io": false, 00:20:08.552 "nvme_io_md": false, 00:20:08.552 "write_zeroes": true, 00:20:08.552 "zcopy": true, 00:20:08.552 "get_zone_info": false, 00:20:08.552 "zone_management": false, 00:20:08.552 "zone_append": false, 00:20:08.552 "compare": false, 00:20:08.552 "compare_and_write": false, 00:20:08.552 "abort": true, 00:20:08.552 "seek_hole": false, 00:20:08.552 "seek_data": false, 00:20:08.552 "copy": true, 00:20:08.552 "nvme_iov_md": false 00:20:08.552 }, 00:20:08.552 "memory_domains": [ 00:20:08.552 { 00:20:08.552 "dma_device_id": "system", 00:20:08.552 "dma_device_type": 1 00:20:08.552 }, 00:20:08.552 { 00:20:08.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.552 "dma_device_type": 2 00:20:08.552 } 00:20:08.552 ], 00:20:08.552 "driver_specific": {} 00:20:08.552 } 00:20:08.552 ] 00:20:08.810 13:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:08.810 13:45:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:08.810 13:45:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:08.810 13:45:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:09.069 BaseBdev4 00:20:09.327 13:45:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev4 00:20:09.327 13:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:09.327 13:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:09.327 13:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:09.327 13:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:09.327 13:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:09.327 13:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:09.327 13:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:09.893 [ 00:20:09.893 { 00:20:09.893 "name": "BaseBdev4", 00:20:09.893 "aliases": [ 00:20:09.893 "a5f4466b-7322-49bf-b540-03700c9f64e3" 00:20:09.893 ], 00:20:09.893 "product_name": "Malloc disk", 00:20:09.893 "block_size": 512, 00:20:09.893 "num_blocks": 65536, 00:20:09.893 "uuid": "a5f4466b-7322-49bf-b540-03700c9f64e3", 00:20:09.893 "assigned_rate_limits": { 00:20:09.893 "rw_ios_per_sec": 0, 00:20:09.893 "rw_mbytes_per_sec": 0, 00:20:09.893 "r_mbytes_per_sec": 0, 00:20:09.893 "w_mbytes_per_sec": 0 00:20:09.893 }, 00:20:09.893 "claimed": false, 00:20:09.893 "zoned": false, 00:20:09.893 "supported_io_types": { 00:20:09.893 "read": true, 00:20:09.893 "write": true, 00:20:09.893 "unmap": true, 00:20:09.893 "flush": true, 00:20:09.893 "reset": true, 00:20:09.893 "nvme_admin": false, 00:20:09.893 "nvme_io": false, 00:20:09.893 "nvme_io_md": false, 00:20:09.893 "write_zeroes": true, 00:20:09.893 "zcopy": true, 00:20:09.893 "get_zone_info": false, 00:20:09.893 "zone_management": false, 00:20:09.893 "zone_append": false, 00:20:09.893 "compare": false, 00:20:09.893 "compare_and_write": false, 00:20:09.893 "abort": true, 00:20:09.893 "seek_hole": false, 00:20:09.893 "seek_data": false, 00:20:09.893 "copy": true, 00:20:09.893 "nvme_iov_md": false 00:20:09.893 }, 00:20:09.893 "memory_domains": [ 00:20:09.893 { 00:20:09.893 "dma_device_id": "system", 00:20:09.893 "dma_device_type": 1 00:20:09.893 }, 00:20:09.893 { 00:20:09.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.893 "dma_device_type": 2 00:20:09.893 } 00:20:09.893 ], 00:20:09.893 "driver_specific": {} 00:20:09.893 } 00:20:09.893 ] 00:20:09.893 13:45:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:09.893 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:09.893 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:09.893 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:10.152 [2024-07-12 13:45:58.640573] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:10.152 [2024-07-12 13:45:58.640614] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:10.152 [2024-07-12 13:45:58.640633] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:10.152 [2024-07-12 13:45:58.641990] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:10.152 [2024-07-12 13:45:58.642031] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.152 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:10.410 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:10.410 "name": "Existed_Raid", 00:20:10.410 "uuid": "31046b93-ece0-4a14-af05-5285bfe450ef", 00:20:10.410 "strip_size_kb": 64, 00:20:10.410 "state": "configuring", 00:20:10.410 "raid_level": "concat", 00:20:10.410 "superblock": true, 00:20:10.410 "num_base_bdevs": 4, 00:20:10.410 "num_base_bdevs_discovered": 3, 00:20:10.410 "num_base_bdevs_operational": 4, 00:20:10.410 "base_bdevs_list": [ 00:20:10.410 { 00:20:10.410 "name": "BaseBdev1", 00:20:10.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:10.410 "is_configured": false, 00:20:10.410 "data_offset": 0, 00:20:10.410 "data_size": 0 00:20:10.410 }, 00:20:10.410 { 00:20:10.410 "name": "BaseBdev2", 00:20:10.410 "uuid": "a2f39964-ac0d-48cc-8e5c-c210c72b6c53", 00:20:10.410 "is_configured": true, 00:20:10.410 "data_offset": 2048, 00:20:10.410 "data_size": 63488 00:20:10.410 }, 00:20:10.410 { 00:20:10.410 "name": "BaseBdev3", 00:20:10.410 "uuid": "959506ab-2de0-48ee-bcc7-287e5de2e76a", 00:20:10.410 "is_configured": true, 00:20:10.410 "data_offset": 2048, 00:20:10.410 "data_size": 63488 00:20:10.410 }, 00:20:10.410 { 00:20:10.410 "name": "BaseBdev4", 00:20:10.410 "uuid": "a5f4466b-7322-49bf-b540-03700c9f64e3", 00:20:10.410 "is_configured": true, 00:20:10.410 "data_offset": 2048, 00:20:10.410 "data_size": 63488 00:20:10.410 } 00:20:10.410 ] 00:20:10.410 }' 00:20:10.410 13:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:10.410 13:45:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:10.977 13:45:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:11.236 [2024-07-12 13:45:59.739450] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:11.236 13:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.494 13:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:11.494 "name": "Existed_Raid", 00:20:11.494 "uuid": "31046b93-ece0-4a14-af05-5285bfe450ef", 00:20:11.494 "strip_size_kb": 64, 00:20:11.494 "state": "configuring", 00:20:11.494 "raid_level": "concat", 00:20:11.494 "superblock": true, 00:20:11.494 "num_base_bdevs": 4, 00:20:11.494 "num_base_bdevs_discovered": 2, 00:20:11.494 "num_base_bdevs_operational": 4, 00:20:11.494 "base_bdevs_list": [ 00:20:11.494 { 00:20:11.494 "name": "BaseBdev1", 00:20:11.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.495 "is_configured": false, 00:20:11.495 "data_offset": 0, 00:20:11.495 "data_size": 0 00:20:11.495 }, 00:20:11.495 { 00:20:11.495 "name": null, 00:20:11.495 "uuid": "a2f39964-ac0d-48cc-8e5c-c210c72b6c53", 00:20:11.495 "is_configured": false, 00:20:11.495 "data_offset": 2048, 00:20:11.495 "data_size": 63488 00:20:11.495 }, 00:20:11.495 { 00:20:11.495 "name": "BaseBdev3", 00:20:11.495 "uuid": "959506ab-2de0-48ee-bcc7-287e5de2e76a", 00:20:11.495 "is_configured": true, 00:20:11.495 "data_offset": 2048, 00:20:11.495 "data_size": 63488 00:20:11.495 }, 00:20:11.495 { 00:20:11.495 "name": "BaseBdev4", 00:20:11.495 "uuid": "a5f4466b-7322-49bf-b540-03700c9f64e3", 00:20:11.495 "is_configured": true, 00:20:11.495 "data_offset": 2048, 00:20:11.495 "data_size": 63488 00:20:11.495 } 00:20:11.495 ] 00:20:11.495 }' 00:20:11.495 13:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:11.495 13:46:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:12.068 13:46:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.068 13:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:12.326 13:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:12.326 13:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:12.895 [2024-07-12 13:46:01.376403] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:12.895 BaseBdev1 00:20:12.895 13:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:12.895 13:46:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:12.895 13:46:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:12.895 13:46:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:12.895 13:46:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:12.895 13:46:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:12.895 13:46:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:13.461 13:46:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:14.028 [ 00:20:14.028 { 00:20:14.028 "name": "BaseBdev1", 00:20:14.028 "aliases": [ 00:20:14.028 "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86" 00:20:14.028 ], 00:20:14.028 "product_name": "Malloc disk", 00:20:14.028 "block_size": 512, 00:20:14.028 "num_blocks": 65536, 00:20:14.028 "uuid": "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86", 00:20:14.028 "assigned_rate_limits": { 00:20:14.028 "rw_ios_per_sec": 0, 00:20:14.028 "rw_mbytes_per_sec": 0, 00:20:14.028 "r_mbytes_per_sec": 0, 00:20:14.028 "w_mbytes_per_sec": 0 00:20:14.028 }, 00:20:14.028 "claimed": true, 00:20:14.028 "claim_type": "exclusive_write", 00:20:14.028 "zoned": false, 00:20:14.028 "supported_io_types": { 00:20:14.028 "read": true, 00:20:14.028 "write": true, 00:20:14.028 "unmap": true, 00:20:14.028 "flush": true, 00:20:14.028 "reset": true, 00:20:14.028 "nvme_admin": false, 00:20:14.028 "nvme_io": false, 00:20:14.028 "nvme_io_md": false, 00:20:14.028 "write_zeroes": true, 00:20:14.028 "zcopy": true, 00:20:14.028 "get_zone_info": false, 00:20:14.028 "zone_management": false, 00:20:14.028 "zone_append": false, 00:20:14.028 "compare": false, 00:20:14.028 "compare_and_write": false, 00:20:14.028 "abort": true, 00:20:14.028 "seek_hole": false, 00:20:14.028 "seek_data": false, 00:20:14.028 "copy": true, 00:20:14.028 "nvme_iov_md": false 00:20:14.028 }, 00:20:14.028 "memory_domains": [ 00:20:14.028 { 00:20:14.028 "dma_device_id": "system", 00:20:14.028 "dma_device_type": 1 00:20:14.028 }, 00:20:14.028 { 00:20:14.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.028 "dma_device_type": 2 00:20:14.028 } 00:20:14.028 ], 00:20:14.028 "driver_specific": {} 00:20:14.028 } 00:20:14.028 ] 00:20:14.028 
13:46:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.028 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:14.288 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:14.288 "name": "Existed_Raid", 00:20:14.288 "uuid": "31046b93-ece0-4a14-af05-5285bfe450ef", 00:20:14.288 "strip_size_kb": 64, 00:20:14.288 "state": "configuring", 00:20:14.288 "raid_level": "concat", 00:20:14.288 "superblock": true, 00:20:14.288 "num_base_bdevs": 4, 00:20:14.288 "num_base_bdevs_discovered": 3, 00:20:14.288 "num_base_bdevs_operational": 4, 00:20:14.288 "base_bdevs_list": [ 00:20:14.288 { 00:20:14.288 "name": "BaseBdev1", 00:20:14.288 "uuid": "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86", 00:20:14.288 "is_configured": true, 00:20:14.288 "data_offset": 2048, 00:20:14.288 "data_size": 63488 00:20:14.288 }, 00:20:14.288 { 00:20:14.288 "name": null, 00:20:14.288 "uuid": "a2f39964-ac0d-48cc-8e5c-c210c72b6c53", 00:20:14.288 "is_configured": false, 00:20:14.288 "data_offset": 2048, 00:20:14.288 "data_size": 63488 00:20:14.288 }, 00:20:14.288 { 00:20:14.288 "name": "BaseBdev3", 00:20:14.288 "uuid": "959506ab-2de0-48ee-bcc7-287e5de2e76a", 00:20:14.288 "is_configured": true, 00:20:14.288 "data_offset": 2048, 00:20:14.288 "data_size": 63488 00:20:14.288 }, 00:20:14.288 { 00:20:14.288 "name": "BaseBdev4", 00:20:14.288 "uuid": "a5f4466b-7322-49bf-b540-03700c9f64e3", 00:20:14.288 "is_configured": true, 00:20:14.288 "data_offset": 2048, 00:20:14.288 "data_size": 63488 00:20:14.288 } 00:20:14.288 ] 00:20:14.288 }' 00:20:14.288 13:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:14.288 13:46:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:14.855 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.855 13:46:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:15.114 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:15.114 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:15.114 [2024-07-12 13:46:03.674546] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:15.372 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:15.372 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:15.372 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:15.372 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:15.372 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:15.372 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:15.372 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:15.372 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:15.372 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:15.373 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:15.373 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.373 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:15.631 13:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:15.631 "name": "Existed_Raid", 00:20:15.631 "uuid": "31046b93-ece0-4a14-af05-5285bfe450ef", 00:20:15.631 "strip_size_kb": 64, 00:20:15.631 "state": "configuring", 00:20:15.631 "raid_level": "concat", 00:20:15.631 "superblock": true, 00:20:15.631 "num_base_bdevs": 4, 00:20:15.631 "num_base_bdevs_discovered": 2, 00:20:15.631 "num_base_bdevs_operational": 4, 00:20:15.631 "base_bdevs_list": [ 00:20:15.631 { 00:20:15.631 "name": "BaseBdev1", 00:20:15.631 "uuid": "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86", 00:20:15.631 "is_configured": true, 00:20:15.631 "data_offset": 2048, 00:20:15.631 "data_size": 63488 00:20:15.631 }, 00:20:15.631 { 00:20:15.631 "name": null, 00:20:15.631 "uuid": "a2f39964-ac0d-48cc-8e5c-c210c72b6c53", 00:20:15.631 "is_configured": false, 00:20:15.631 "data_offset": 2048, 00:20:15.631 "data_size": 63488 00:20:15.631 }, 00:20:15.631 { 00:20:15.631 "name": null, 00:20:15.631 "uuid": "959506ab-2de0-48ee-bcc7-287e5de2e76a", 00:20:15.631 "is_configured": false, 00:20:15.631 "data_offset": 2048, 00:20:15.631 "data_size": 63488 00:20:15.631 }, 00:20:15.631 { 00:20:15.631 "name": "BaseBdev4", 00:20:15.631 "uuid": "a5f4466b-7322-49bf-b540-03700c9f64e3", 00:20:15.631 "is_configured": true, 00:20:15.631 "data_offset": 2048, 00:20:15.631 "data_size": 63488 00:20:15.631 } 00:20:15.631 ] 00:20:15.631 }' 00:20:15.631 13:46:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:15.631 13:46:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:16.199 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.199 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:16.199 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:16.199 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:16.458 [2024-07-12 13:46:04.958125] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.458 13:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:16.717 13:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:16.717 "name": "Existed_Raid", 00:20:16.717 "uuid": "31046b93-ece0-4a14-af05-5285bfe450ef", 00:20:16.717 "strip_size_kb": 64, 00:20:16.717 "state": "configuring", 00:20:16.717 "raid_level": "concat", 00:20:16.717 "superblock": true, 00:20:16.717 "num_base_bdevs": 4, 00:20:16.718 "num_base_bdevs_discovered": 3, 00:20:16.718 "num_base_bdevs_operational": 4, 00:20:16.718 "base_bdevs_list": [ 00:20:16.718 { 00:20:16.718 "name": "BaseBdev1", 00:20:16.718 "uuid": "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86", 00:20:16.718 "is_configured": true, 00:20:16.718 "data_offset": 2048, 00:20:16.718 "data_size": 63488 00:20:16.718 }, 00:20:16.718 { 00:20:16.718 "name": null, 00:20:16.718 "uuid": "a2f39964-ac0d-48cc-8e5c-c210c72b6c53", 00:20:16.718 "is_configured": false, 00:20:16.718 "data_offset": 2048, 00:20:16.718 "data_size": 63488 00:20:16.718 }, 00:20:16.718 { 00:20:16.718 "name": "BaseBdev3", 00:20:16.718 "uuid": "959506ab-2de0-48ee-bcc7-287e5de2e76a", 
00:20:16.718 "is_configured": true, 00:20:16.718 "data_offset": 2048, 00:20:16.718 "data_size": 63488 00:20:16.718 }, 00:20:16.718 { 00:20:16.718 "name": "BaseBdev4", 00:20:16.718 "uuid": "a5f4466b-7322-49bf-b540-03700c9f64e3", 00:20:16.718 "is_configured": true, 00:20:16.718 "data_offset": 2048, 00:20:16.718 "data_size": 63488 00:20:16.718 } 00:20:16.718 ] 00:20:16.718 }' 00:20:16.718 13:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:16.718 13:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:17.286 13:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.286 13:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:17.544 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:17.544 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:17.804 [2024-07-12 13:46:06.249567] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.804 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:18.063 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:18.063 "name": "Existed_Raid", 00:20:18.063 "uuid": "31046b93-ece0-4a14-af05-5285bfe450ef", 00:20:18.063 "strip_size_kb": 64, 00:20:18.063 "state": "configuring", 00:20:18.063 "raid_level": "concat", 00:20:18.063 "superblock": true, 00:20:18.063 "num_base_bdevs": 4, 00:20:18.063 "num_base_bdevs_discovered": 2, 00:20:18.063 "num_base_bdevs_operational": 4, 00:20:18.063 "base_bdevs_list": [ 00:20:18.063 { 00:20:18.063 "name": null, 00:20:18.063 "uuid": "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86", 00:20:18.063 "is_configured": false, 00:20:18.063 "data_offset": 
2048, 00:20:18.063 "data_size": 63488 00:20:18.063 }, 00:20:18.063 { 00:20:18.063 "name": null, 00:20:18.063 "uuid": "a2f39964-ac0d-48cc-8e5c-c210c72b6c53", 00:20:18.063 "is_configured": false, 00:20:18.063 "data_offset": 2048, 00:20:18.063 "data_size": 63488 00:20:18.063 }, 00:20:18.063 { 00:20:18.063 "name": "BaseBdev3", 00:20:18.063 "uuid": "959506ab-2de0-48ee-bcc7-287e5de2e76a", 00:20:18.063 "is_configured": true, 00:20:18.063 "data_offset": 2048, 00:20:18.063 "data_size": 63488 00:20:18.063 }, 00:20:18.063 { 00:20:18.063 "name": "BaseBdev4", 00:20:18.063 "uuid": "a5f4466b-7322-49bf-b540-03700c9f64e3", 00:20:18.063 "is_configured": true, 00:20:18.063 "data_offset": 2048, 00:20:18.063 "data_size": 63488 00:20:18.063 } 00:20:18.063 ] 00:20:18.063 }' 00:20:18.063 13:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:18.063 13:46:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:18.631 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.631 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:18.890 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:18.890 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:19.150 [2024-07-12 13:46:07.599630] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.150 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:19.409 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:19.409 "name": "Existed_Raid", 00:20:19.409 "uuid": "31046b93-ece0-4a14-af05-5285bfe450ef", 00:20:19.409 "strip_size_kb": 64, 
00:20:19.409 "state": "configuring", 00:20:19.409 "raid_level": "concat", 00:20:19.409 "superblock": true, 00:20:19.409 "num_base_bdevs": 4, 00:20:19.409 "num_base_bdevs_discovered": 3, 00:20:19.409 "num_base_bdevs_operational": 4, 00:20:19.409 "base_bdevs_list": [ 00:20:19.409 { 00:20:19.409 "name": null, 00:20:19.409 "uuid": "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86", 00:20:19.409 "is_configured": false, 00:20:19.409 "data_offset": 2048, 00:20:19.409 "data_size": 63488 00:20:19.409 }, 00:20:19.409 { 00:20:19.409 "name": "BaseBdev2", 00:20:19.409 "uuid": "a2f39964-ac0d-48cc-8e5c-c210c72b6c53", 00:20:19.409 "is_configured": true, 00:20:19.409 "data_offset": 2048, 00:20:19.409 "data_size": 63488 00:20:19.409 }, 00:20:19.409 { 00:20:19.409 "name": "BaseBdev3", 00:20:19.409 "uuid": "959506ab-2de0-48ee-bcc7-287e5de2e76a", 00:20:19.409 "is_configured": true, 00:20:19.409 "data_offset": 2048, 00:20:19.409 "data_size": 63488 00:20:19.409 }, 00:20:19.409 { 00:20:19.409 "name": "BaseBdev4", 00:20:19.409 "uuid": "a5f4466b-7322-49bf-b540-03700c9f64e3", 00:20:19.409 "is_configured": true, 00:20:19.409 "data_offset": 2048, 00:20:19.409 "data_size": 63488 00:20:19.409 } 00:20:19.409 ] 00:20:19.409 }' 00:20:19.409 13:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:19.409 13:46:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:19.975 13:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.975 13:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:20.234 13:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:20.234 13:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.234 13:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:20.495 13:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86 00:20:20.755 [2024-07-12 13:46:09.200434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:20.755 [2024-07-12 13:46:09.200590] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc43ad0 00:20:20.755 [2024-07-12 13:46:09.200603] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:20.755 [2024-07-12 13:46:09.200774] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa9e6a0 00:20:20.755 [2024-07-12 13:46:09.200895] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc43ad0 00:20:20.755 [2024-07-12 13:46:09.200905] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc43ad0 00:20:20.755 [2024-07-12 13:46:09.201004] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:20.755 NewBaseBdev 00:20:20.755 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:20.755 13:46:09 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:20.755 13:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:20.755 13:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:20.755 13:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:20.755 13:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:20.755 13:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:21.014 13:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:21.273 [ 00:20:21.273 { 00:20:21.273 "name": "NewBaseBdev", 00:20:21.273 "aliases": [ 00:20:21.273 "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86" 00:20:21.273 ], 00:20:21.273 "product_name": "Malloc disk", 00:20:21.273 "block_size": 512, 00:20:21.273 "num_blocks": 65536, 00:20:21.273 "uuid": "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86", 00:20:21.273 "assigned_rate_limits": { 00:20:21.273 "rw_ios_per_sec": 0, 00:20:21.273 "rw_mbytes_per_sec": 0, 00:20:21.273 "r_mbytes_per_sec": 0, 00:20:21.273 "w_mbytes_per_sec": 0 00:20:21.273 }, 00:20:21.273 "claimed": true, 00:20:21.273 "claim_type": "exclusive_write", 00:20:21.273 "zoned": false, 00:20:21.273 "supported_io_types": { 00:20:21.273 "read": true, 00:20:21.273 "write": true, 00:20:21.273 "unmap": true, 00:20:21.273 "flush": true, 00:20:21.273 "reset": true, 00:20:21.273 "nvme_admin": false, 00:20:21.273 "nvme_io": false, 00:20:21.273 "nvme_io_md": false, 00:20:21.273 "write_zeroes": true, 00:20:21.273 "zcopy": true, 00:20:21.274 "get_zone_info": false, 00:20:21.274 "zone_management": false, 00:20:21.274 "zone_append": false, 00:20:21.274 "compare": false, 00:20:21.274 "compare_and_write": false, 00:20:21.274 "abort": true, 00:20:21.274 "seek_hole": false, 00:20:21.274 "seek_data": false, 00:20:21.274 "copy": true, 00:20:21.274 "nvme_iov_md": false 00:20:21.274 }, 00:20:21.274 "memory_domains": [ 00:20:21.274 { 00:20:21.274 "dma_device_id": "system", 00:20:21.274 "dma_device_type": 1 00:20:21.274 }, 00:20:21.274 { 00:20:21.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.274 "dma_device_type": 2 00:20:21.274 } 00:20:21.274 ], 00:20:21.274 "driver_specific": {} 00:20:21.274 } 00:20:21.274 ] 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.274 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:21.532 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.532 "name": "Existed_Raid", 00:20:21.532 "uuid": "31046b93-ece0-4a14-af05-5285bfe450ef", 00:20:21.532 "strip_size_kb": 64, 00:20:21.532 "state": "online", 00:20:21.532 "raid_level": "concat", 00:20:21.532 "superblock": true, 00:20:21.532 "num_base_bdevs": 4, 00:20:21.532 "num_base_bdevs_discovered": 4, 00:20:21.532 "num_base_bdevs_operational": 4, 00:20:21.532 "base_bdevs_list": [ 00:20:21.532 { 00:20:21.532 "name": "NewBaseBdev", 00:20:21.533 "uuid": "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86", 00:20:21.533 "is_configured": true, 00:20:21.533 "data_offset": 2048, 00:20:21.533 "data_size": 63488 00:20:21.533 }, 00:20:21.533 { 00:20:21.533 "name": "BaseBdev2", 00:20:21.533 "uuid": "a2f39964-ac0d-48cc-8e5c-c210c72b6c53", 00:20:21.533 "is_configured": true, 00:20:21.533 "data_offset": 2048, 00:20:21.533 "data_size": 63488 00:20:21.533 }, 00:20:21.533 { 00:20:21.533 "name": "BaseBdev3", 00:20:21.533 "uuid": "959506ab-2de0-48ee-bcc7-287e5de2e76a", 00:20:21.533 "is_configured": true, 00:20:21.533 "data_offset": 2048, 00:20:21.533 "data_size": 63488 00:20:21.533 }, 00:20:21.533 { 00:20:21.533 "name": "BaseBdev4", 00:20:21.533 "uuid": "a5f4466b-7322-49bf-b540-03700c9f64e3", 00:20:21.533 "is_configured": true, 00:20:21.533 "data_offset": 2048, 00:20:21.533 "data_size": 63488 00:20:21.533 } 00:20:21.533 ] 00:20:21.533 }' 00:20:21.533 13:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.533 13:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:22.101 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:22.101 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:22.101 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:22.101 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:22.101 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:22.101 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:22.101 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:22.101 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:22.360 [2024-07-12 13:46:10.772916] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:22.361 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:20:22.361 "name": "Existed_Raid", 00:20:22.361 "aliases": [ 00:20:22.361 "31046b93-ece0-4a14-af05-5285bfe450ef" 00:20:22.361 ], 00:20:22.361 "product_name": "Raid Volume", 00:20:22.361 "block_size": 512, 00:20:22.361 "num_blocks": 253952, 00:20:22.361 "uuid": "31046b93-ece0-4a14-af05-5285bfe450ef", 00:20:22.361 "assigned_rate_limits": { 00:20:22.361 "rw_ios_per_sec": 0, 00:20:22.361 "rw_mbytes_per_sec": 0, 00:20:22.361 "r_mbytes_per_sec": 0, 00:20:22.361 "w_mbytes_per_sec": 0 00:20:22.361 }, 00:20:22.361 "claimed": false, 00:20:22.361 "zoned": false, 00:20:22.361 "supported_io_types": { 00:20:22.361 "read": true, 00:20:22.361 "write": true, 00:20:22.361 "unmap": true, 00:20:22.361 "flush": true, 00:20:22.361 "reset": true, 00:20:22.361 "nvme_admin": false, 00:20:22.361 "nvme_io": false, 00:20:22.361 "nvme_io_md": false, 00:20:22.361 "write_zeroes": true, 00:20:22.361 "zcopy": false, 00:20:22.361 "get_zone_info": false, 00:20:22.361 "zone_management": false, 00:20:22.361 "zone_append": false, 00:20:22.361 "compare": false, 00:20:22.361 "compare_and_write": false, 00:20:22.361 "abort": false, 00:20:22.361 "seek_hole": false, 00:20:22.361 "seek_data": false, 00:20:22.361 "copy": false, 00:20:22.361 "nvme_iov_md": false 00:20:22.361 }, 00:20:22.361 "memory_domains": [ 00:20:22.361 { 00:20:22.361 "dma_device_id": "system", 00:20:22.361 "dma_device_type": 1 00:20:22.361 }, 00:20:22.361 { 00:20:22.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.361 "dma_device_type": 2 00:20:22.361 }, 00:20:22.361 { 00:20:22.361 "dma_device_id": "system", 00:20:22.361 "dma_device_type": 1 00:20:22.361 }, 00:20:22.361 { 00:20:22.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.361 "dma_device_type": 2 00:20:22.361 }, 00:20:22.361 { 00:20:22.361 "dma_device_id": "system", 00:20:22.361 "dma_device_type": 1 00:20:22.361 }, 00:20:22.361 { 00:20:22.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.361 "dma_device_type": 2 00:20:22.361 }, 00:20:22.361 { 00:20:22.361 "dma_device_id": "system", 00:20:22.361 "dma_device_type": 1 00:20:22.361 }, 00:20:22.361 { 00:20:22.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.361 "dma_device_type": 2 00:20:22.361 } 00:20:22.361 ], 00:20:22.361 "driver_specific": { 00:20:22.361 "raid": { 00:20:22.361 "uuid": "31046b93-ece0-4a14-af05-5285bfe450ef", 00:20:22.361 "strip_size_kb": 64, 00:20:22.361 "state": "online", 00:20:22.361 "raid_level": "concat", 00:20:22.361 "superblock": true, 00:20:22.361 "num_base_bdevs": 4, 00:20:22.361 "num_base_bdevs_discovered": 4, 00:20:22.361 "num_base_bdevs_operational": 4, 00:20:22.361 "base_bdevs_list": [ 00:20:22.361 { 00:20:22.361 "name": "NewBaseBdev", 00:20:22.361 "uuid": "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86", 00:20:22.361 "is_configured": true, 00:20:22.361 "data_offset": 2048, 00:20:22.361 "data_size": 63488 00:20:22.361 }, 00:20:22.361 { 00:20:22.361 "name": "BaseBdev2", 00:20:22.361 "uuid": "a2f39964-ac0d-48cc-8e5c-c210c72b6c53", 00:20:22.361 "is_configured": true, 00:20:22.361 "data_offset": 2048, 00:20:22.361 "data_size": 63488 00:20:22.361 }, 00:20:22.361 { 00:20:22.361 "name": "BaseBdev3", 00:20:22.361 "uuid": "959506ab-2de0-48ee-bcc7-287e5de2e76a", 00:20:22.361 "is_configured": true, 00:20:22.361 "data_offset": 2048, 00:20:22.361 "data_size": 63488 00:20:22.361 }, 00:20:22.361 { 00:20:22.361 "name": "BaseBdev4", 00:20:22.361 "uuid": "a5f4466b-7322-49bf-b540-03700c9f64e3", 00:20:22.361 "is_configured": true, 00:20:22.361 "data_offset": 2048, 00:20:22.361 "data_size": 63488 00:20:22.361 } 
00:20:22.361 ] 00:20:22.361 } 00:20:22.361 } 00:20:22.361 }' 00:20:22.361 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:22.361 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:22.361 BaseBdev2 00:20:22.361 BaseBdev3 00:20:22.361 BaseBdev4' 00:20:22.361 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:22.361 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:22.361 13:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:22.620 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:22.620 "name": "NewBaseBdev", 00:20:22.620 "aliases": [ 00:20:22.620 "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86" 00:20:22.620 ], 00:20:22.620 "product_name": "Malloc disk", 00:20:22.620 "block_size": 512, 00:20:22.620 "num_blocks": 65536, 00:20:22.620 "uuid": "3566b3fe-ad0d-4875-a9b8-ea9f3e5c3b86", 00:20:22.620 "assigned_rate_limits": { 00:20:22.620 "rw_ios_per_sec": 0, 00:20:22.620 "rw_mbytes_per_sec": 0, 00:20:22.620 "r_mbytes_per_sec": 0, 00:20:22.620 "w_mbytes_per_sec": 0 00:20:22.620 }, 00:20:22.620 "claimed": true, 00:20:22.620 "claim_type": "exclusive_write", 00:20:22.620 "zoned": false, 00:20:22.620 "supported_io_types": { 00:20:22.620 "read": true, 00:20:22.620 "write": true, 00:20:22.620 "unmap": true, 00:20:22.620 "flush": true, 00:20:22.620 "reset": true, 00:20:22.620 "nvme_admin": false, 00:20:22.620 "nvme_io": false, 00:20:22.620 "nvme_io_md": false, 00:20:22.620 "write_zeroes": true, 00:20:22.620 "zcopy": true, 00:20:22.620 "get_zone_info": false, 00:20:22.620 "zone_management": false, 00:20:22.620 "zone_append": false, 00:20:22.620 "compare": false, 00:20:22.620 "compare_and_write": false, 00:20:22.620 "abort": true, 00:20:22.620 "seek_hole": false, 00:20:22.620 "seek_data": false, 00:20:22.620 "copy": true, 00:20:22.620 "nvme_iov_md": false 00:20:22.620 }, 00:20:22.620 "memory_domains": [ 00:20:22.620 { 00:20:22.620 "dma_device_id": "system", 00:20:22.620 "dma_device_type": 1 00:20:22.620 }, 00:20:22.620 { 00:20:22.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.620 "dma_device_type": 2 00:20:22.620 } 00:20:22.620 ], 00:20:22.620 "driver_specific": {} 00:20:22.620 }' 00:20:22.620 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.620 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.620 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:22.620 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.878 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.878 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:22.878 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.878 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.878 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:22.878 
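The block of jq checks above is the per-base-bdev property verification: the harness extracts every configured base bdev name from the raid volume's driver_specific output, fetches each bdev, and compares block_size, md_size, md_interleave and dif_type against the raid volume. A condensed sketch of that loop, using only the RPCs and jq filters visible in the trace, might look like this (the early-exit on mismatch is a simplification of the real helper):

RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock

raid_info=$($RPC -s $SOCK bdev_get_bdevs -b Existed_Raid | jq '.[]')
names=$(echo "$raid_info" | jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')
for name in $names; do
    base_info=$($RPC -s $SOCK bdev_get_bdevs -b "$name" | jq '.[]')
    for field in .block_size .md_size .md_interleave .dif_type; do
        # raid volume and base bdev must agree (512 == 512, null == null in the trace)
        [ "$(echo "$raid_info" | jq "$field")" = "$(echo "$base_info" | jq "$field")" ] || exit 1
    done
done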
13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.878 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.878 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:22.878 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:22.878 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:22.878 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.136 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.136 "name": "BaseBdev2", 00:20:23.136 "aliases": [ 00:20:23.136 "a2f39964-ac0d-48cc-8e5c-c210c72b6c53" 00:20:23.136 ], 00:20:23.136 "product_name": "Malloc disk", 00:20:23.136 "block_size": 512, 00:20:23.136 "num_blocks": 65536, 00:20:23.136 "uuid": "a2f39964-ac0d-48cc-8e5c-c210c72b6c53", 00:20:23.136 "assigned_rate_limits": { 00:20:23.136 "rw_ios_per_sec": 0, 00:20:23.136 "rw_mbytes_per_sec": 0, 00:20:23.136 "r_mbytes_per_sec": 0, 00:20:23.136 "w_mbytes_per_sec": 0 00:20:23.136 }, 00:20:23.136 "claimed": true, 00:20:23.136 "claim_type": "exclusive_write", 00:20:23.136 "zoned": false, 00:20:23.136 "supported_io_types": { 00:20:23.136 "read": true, 00:20:23.136 "write": true, 00:20:23.136 "unmap": true, 00:20:23.136 "flush": true, 00:20:23.136 "reset": true, 00:20:23.136 "nvme_admin": false, 00:20:23.136 "nvme_io": false, 00:20:23.136 "nvme_io_md": false, 00:20:23.136 "write_zeroes": true, 00:20:23.136 "zcopy": true, 00:20:23.136 "get_zone_info": false, 00:20:23.136 "zone_management": false, 00:20:23.136 "zone_append": false, 00:20:23.136 "compare": false, 00:20:23.136 "compare_and_write": false, 00:20:23.136 "abort": true, 00:20:23.136 "seek_hole": false, 00:20:23.136 "seek_data": false, 00:20:23.136 "copy": true, 00:20:23.136 "nvme_iov_md": false 00:20:23.136 }, 00:20:23.136 "memory_domains": [ 00:20:23.136 { 00:20:23.136 "dma_device_id": "system", 00:20:23.136 "dma_device_type": 1 00:20:23.136 }, 00:20:23.136 { 00:20:23.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.136 "dma_device_type": 2 00:20:23.136 } 00:20:23.136 ], 00:20:23.136 "driver_specific": {} 00:20:23.136 }' 00:20:23.136 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.394 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.394 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.394 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.394 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.394 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.394 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.394 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.394 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.394 13:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.394 13:46:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.652 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.652 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.652 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:23.652 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.652 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.652 "name": "BaseBdev3", 00:20:23.652 "aliases": [ 00:20:23.652 "959506ab-2de0-48ee-bcc7-287e5de2e76a" 00:20:23.652 ], 00:20:23.652 "product_name": "Malloc disk", 00:20:23.652 "block_size": 512, 00:20:23.652 "num_blocks": 65536, 00:20:23.652 "uuid": "959506ab-2de0-48ee-bcc7-287e5de2e76a", 00:20:23.652 "assigned_rate_limits": { 00:20:23.652 "rw_ios_per_sec": 0, 00:20:23.652 "rw_mbytes_per_sec": 0, 00:20:23.652 "r_mbytes_per_sec": 0, 00:20:23.652 "w_mbytes_per_sec": 0 00:20:23.652 }, 00:20:23.652 "claimed": true, 00:20:23.652 "claim_type": "exclusive_write", 00:20:23.652 "zoned": false, 00:20:23.652 "supported_io_types": { 00:20:23.652 "read": true, 00:20:23.652 "write": true, 00:20:23.652 "unmap": true, 00:20:23.652 "flush": true, 00:20:23.652 "reset": true, 00:20:23.652 "nvme_admin": false, 00:20:23.652 "nvme_io": false, 00:20:23.652 "nvme_io_md": false, 00:20:23.652 "write_zeroes": true, 00:20:23.652 "zcopy": true, 00:20:23.652 "get_zone_info": false, 00:20:23.652 "zone_management": false, 00:20:23.652 "zone_append": false, 00:20:23.652 "compare": false, 00:20:23.652 "compare_and_write": false, 00:20:23.652 "abort": true, 00:20:23.652 "seek_hole": false, 00:20:23.652 "seek_data": false, 00:20:23.652 "copy": true, 00:20:23.652 "nvme_iov_md": false 00:20:23.652 }, 00:20:23.652 "memory_domains": [ 00:20:23.652 { 00:20:23.652 "dma_device_id": "system", 00:20:23.652 "dma_device_type": 1 00:20:23.652 }, 00:20:23.652 { 00:20:23.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.652 "dma_device_type": 2 00:20:23.652 } 00:20:23.652 ], 00:20:23.652 "driver_specific": {} 00:20:23.652 }' 00:20:23.652 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.910 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.910 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.910 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.910 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.910 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.910 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.910 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.910 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.910 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.168 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.168 13:46:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:24.168 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:24.168 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:24.168 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:24.429 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:24.429 "name": "BaseBdev4", 00:20:24.429 "aliases": [ 00:20:24.429 "a5f4466b-7322-49bf-b540-03700c9f64e3" 00:20:24.429 ], 00:20:24.429 "product_name": "Malloc disk", 00:20:24.429 "block_size": 512, 00:20:24.429 "num_blocks": 65536, 00:20:24.429 "uuid": "a5f4466b-7322-49bf-b540-03700c9f64e3", 00:20:24.429 "assigned_rate_limits": { 00:20:24.429 "rw_ios_per_sec": 0, 00:20:24.429 "rw_mbytes_per_sec": 0, 00:20:24.429 "r_mbytes_per_sec": 0, 00:20:24.429 "w_mbytes_per_sec": 0 00:20:24.429 }, 00:20:24.429 "claimed": true, 00:20:24.429 "claim_type": "exclusive_write", 00:20:24.429 "zoned": false, 00:20:24.429 "supported_io_types": { 00:20:24.429 "read": true, 00:20:24.429 "write": true, 00:20:24.429 "unmap": true, 00:20:24.429 "flush": true, 00:20:24.429 "reset": true, 00:20:24.429 "nvme_admin": false, 00:20:24.429 "nvme_io": false, 00:20:24.429 "nvme_io_md": false, 00:20:24.429 "write_zeroes": true, 00:20:24.429 "zcopy": true, 00:20:24.429 "get_zone_info": false, 00:20:24.429 "zone_management": false, 00:20:24.429 "zone_append": false, 00:20:24.429 "compare": false, 00:20:24.429 "compare_and_write": false, 00:20:24.429 "abort": true, 00:20:24.429 "seek_hole": false, 00:20:24.429 "seek_data": false, 00:20:24.429 "copy": true, 00:20:24.429 "nvme_iov_md": false 00:20:24.429 }, 00:20:24.429 "memory_domains": [ 00:20:24.429 { 00:20:24.429 "dma_device_id": "system", 00:20:24.429 "dma_device_type": 1 00:20:24.429 }, 00:20:24.429 { 00:20:24.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.429 "dma_device_type": 2 00:20:24.429 } 00:20:24.429 ], 00:20:24.429 "driver_specific": {} 00:20:24.429 }' 00:20:24.429 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.429 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.429 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:24.429 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.429 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.429 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:24.429 13:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.713 13:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.713 13:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:24.713 13:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.713 13:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.713 13:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:24.713 13:46:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:25.024 [2024-07-12 13:46:13.295316] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:25.024 [2024-07-12 13:46:13.295340] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:25.024 [2024-07-12 13:46:13.295390] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:25.024 [2024-07-12 13:46:13.295450] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:25.024 [2024-07-12 13:46:13.295462] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc43ad0 name Existed_Raid, state offline 00:20:25.024 13:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 510722 00:20:25.024 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 510722 ']' 00:20:25.024 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 510722 00:20:25.024 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:25.024 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:25.024 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 510722 00:20:25.024 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:25.024 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:25.024 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 510722' 00:20:25.024 killing process with pid 510722 00:20:25.024 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 510722 00:20:25.024 [2024-07-12 13:46:13.377834] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:25.024 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 510722 00:20:25.024 [2024-07-12 13:46:13.416710] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:25.383 13:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:20:25.383 00:20:25.383 real 0m34.987s 00:20:25.383 user 1m4.282s 00:20:25.383 sys 0m6.170s 00:20:25.383 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:25.383 13:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:25.383 ************************************ 00:20:25.383 END TEST raid_state_function_test_sb 00:20:25.383 ************************************ 00:20:25.383 13:46:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:25.383 13:46:13 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:20:25.383 13:46:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:20:25.383 13:46:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:25.383 13:46:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:25.383 ************************************ 00:20:25.383 START TEST raid_superblock_test 00:20:25.383 
************************************ 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=515867 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 515867 /var/tmp/spdk-raid.sock 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 515867 ']' 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:25.383 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:25.383 13:46:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:25.383 [2024-07-12 13:46:13.803345] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
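The superblock test starts a bare bdev_svc application on its own UNIX-domain RPC socket with raid debug logging enabled, then blocks until that socket answers before issuing any bdev RPCs. A rough bring-up sketch using the command line shown above; the rpc_get_methods readiness probe and the polling loop are assumptions standing in for the harness's waitforlisten helper, which performs additional checks.

APP=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock

$APP -r $SOCK -L bdev_raid &                           # dedicated RPC socket, raid debug log component
raid_pid=$!
until $RPC -s $SOCK rpc_get_methods >/dev/null 2>&1; do   # assumed readiness probe
    sleep 0.2                                             # assumed poll interval
done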
00:20:25.383 [2024-07-12 13:46:13.803415] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid515867 ] 00:20:25.383 [2024-07-12 13:46:13.922276] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:25.642 [2024-07-12 13:46:14.027175] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:25.642 [2024-07-12 13:46:14.087343] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:25.642 [2024-07-12 13:46:14.087382] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:26.209 13:46:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:26.209 13:46:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:20:26.209 13:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:26.209 13:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:26.209 13:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:26.210 13:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:26.210 13:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:26.210 13:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:26.210 13:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:26.210 13:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:26.210 13:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:20:26.469 malloc1 00:20:26.469 13:46:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:26.728 [2024-07-12 13:46:15.204282] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:26.728 [2024-07-12 13:46:15.204332] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:26.728 [2024-07-12 13:46:15.204353] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2036e90 00:20:26.728 [2024-07-12 13:46:15.204366] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:26.728 [2024-07-12 13:46:15.206045] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:26.728 [2024-07-12 13:46:15.206075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:26.728 pt1 00:20:26.728 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:26.728 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:26.728 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:26.728 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:26.728 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:26.728 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:26.728 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:26.728 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:26.728 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:26.987 malloc2 00:20:26.987 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:27.246 [2024-07-12 13:46:15.699564] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:27.246 [2024-07-12 13:46:15.699611] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:27.246 [2024-07-12 13:46:15.699628] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d4fb0 00:20:27.246 [2024-07-12 13:46:15.699641] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:27.246 [2024-07-12 13:46:15.701218] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:27.246 [2024-07-12 13:46:15.701247] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:27.246 pt2 00:20:27.246 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:27.246 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:27.246 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:20:27.246 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:20:27.246 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:27.246 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:27.246 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:27.247 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:27.247 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:27.506 malloc3 00:20:27.506 13:46:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:27.765 [2024-07-12 13:46:16.193529] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:27.765 [2024-07-12 13:46:16.193575] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:27.765 [2024-07-12 13:46:16.193593] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d5ce0 00:20:27.765 [2024-07-12 13:46:16.193605] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:27.765 [2024-07-12 13:46:16.195146] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:27.765 [2024-07-12 13:46:16.195174] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:27.765 pt3 00:20:27.765 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:27.765 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:27.765 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:20:27.765 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:20:27.765 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:20:27.765 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:27.765 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:27.765 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:27.765 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:20:28.024 malloc4 00:20:28.024 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:28.592 [2024-07-12 13:46:16.944107] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:28.592 [2024-07-12 13:46:16.944157] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:28.592 [2024-07-12 13:46:16.944174] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d8450 00:20:28.592 [2024-07-12 13:46:16.944187] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:28.592 [2024-07-12 13:46:16.945790] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:28.592 [2024-07-12 13:46:16.945817] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:28.592 pt4 00:20:28.592 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:28.592 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:28.592 13:46:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:20:28.850 [2024-07-12 13:46:17.200801] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:28.850 [2024-07-12 13:46:17.202162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:28.850 [2024-07-12 13:46:17.202218] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:28.850 [2024-07-12 13:46:17.202263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:28.850 [2024-07-12 13:46:17.202434] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2039c20 00:20:28.850 [2024-07-12 13:46:17.202446] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:28.850 [2024-07-12 13:46:17.202653] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20da910 00:20:28.850 [2024-07-12 13:46:17.202804] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2039c20 00:20:28.850 [2024-07-12 13:46:17.202814] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2039c20 00:20:28.850 [2024-07-12 13:46:17.202911] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.850 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.110 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.110 "name": "raid_bdev1", 00:20:29.110 "uuid": "29d5b57f-75d0-494b-a44e-7f5e90d2a4ff", 00:20:29.110 "strip_size_kb": 64, 00:20:29.110 "state": "online", 00:20:29.110 "raid_level": "concat", 00:20:29.110 "superblock": true, 00:20:29.110 "num_base_bdevs": 4, 00:20:29.110 "num_base_bdevs_discovered": 4, 00:20:29.110 "num_base_bdevs_operational": 4, 00:20:29.110 "base_bdevs_list": [ 00:20:29.110 { 00:20:29.110 "name": "pt1", 00:20:29.110 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:29.110 "is_configured": true, 00:20:29.110 "data_offset": 2048, 00:20:29.110 "data_size": 63488 00:20:29.110 }, 00:20:29.110 { 00:20:29.110 "name": "pt2", 00:20:29.110 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:29.110 "is_configured": true, 00:20:29.110 "data_offset": 2048, 00:20:29.110 "data_size": 63488 00:20:29.110 }, 00:20:29.110 { 00:20:29.110 "name": "pt3", 00:20:29.110 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:29.110 "is_configured": true, 00:20:29.110 "data_offset": 2048, 00:20:29.110 "data_size": 63488 00:20:29.110 }, 00:20:29.110 { 00:20:29.110 "name": "pt4", 00:20:29.110 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:29.110 "is_configured": true, 00:20:29.110 "data_offset": 2048, 00:20:29.110 "data_size": 63488 00:20:29.110 } 00:20:29.110 ] 00:20:29.110 }' 00:20:29.110 13:46:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.110 13:46:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:29.678 13:46:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:20:29.678 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:29.678 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:29.678 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:29.678 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:29.678 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:29.678 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:29.678 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:29.938 [2024-07-12 13:46:18.300008] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:29.938 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:29.938 "name": "raid_bdev1", 00:20:29.938 "aliases": [ 00:20:29.938 "29d5b57f-75d0-494b-a44e-7f5e90d2a4ff" 00:20:29.938 ], 00:20:29.938 "product_name": "Raid Volume", 00:20:29.938 "block_size": 512, 00:20:29.938 "num_blocks": 253952, 00:20:29.938 "uuid": "29d5b57f-75d0-494b-a44e-7f5e90d2a4ff", 00:20:29.938 "assigned_rate_limits": { 00:20:29.938 "rw_ios_per_sec": 0, 00:20:29.938 "rw_mbytes_per_sec": 0, 00:20:29.938 "r_mbytes_per_sec": 0, 00:20:29.938 "w_mbytes_per_sec": 0 00:20:29.938 }, 00:20:29.938 "claimed": false, 00:20:29.938 "zoned": false, 00:20:29.938 "supported_io_types": { 00:20:29.938 "read": true, 00:20:29.938 "write": true, 00:20:29.938 "unmap": true, 00:20:29.938 "flush": true, 00:20:29.938 "reset": true, 00:20:29.938 "nvme_admin": false, 00:20:29.938 "nvme_io": false, 00:20:29.938 "nvme_io_md": false, 00:20:29.938 "write_zeroes": true, 00:20:29.938 "zcopy": false, 00:20:29.938 "get_zone_info": false, 00:20:29.938 "zone_management": false, 00:20:29.938 "zone_append": false, 00:20:29.938 "compare": false, 00:20:29.938 "compare_and_write": false, 00:20:29.938 "abort": false, 00:20:29.938 "seek_hole": false, 00:20:29.938 "seek_data": false, 00:20:29.938 "copy": false, 00:20:29.938 "nvme_iov_md": false 00:20:29.938 }, 00:20:29.938 "memory_domains": [ 00:20:29.938 { 00:20:29.938 "dma_device_id": "system", 00:20:29.938 "dma_device_type": 1 00:20:29.938 }, 00:20:29.938 { 00:20:29.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.938 "dma_device_type": 2 00:20:29.938 }, 00:20:29.938 { 00:20:29.938 "dma_device_id": "system", 00:20:29.938 "dma_device_type": 1 00:20:29.938 }, 00:20:29.938 { 00:20:29.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.938 "dma_device_type": 2 00:20:29.938 }, 00:20:29.938 { 00:20:29.938 "dma_device_id": "system", 00:20:29.938 "dma_device_type": 1 00:20:29.938 }, 00:20:29.938 { 00:20:29.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.938 "dma_device_type": 2 00:20:29.938 }, 00:20:29.938 { 00:20:29.938 "dma_device_id": "system", 00:20:29.938 "dma_device_type": 1 00:20:29.938 }, 00:20:29.938 { 00:20:29.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.938 "dma_device_type": 2 00:20:29.938 } 00:20:29.938 ], 00:20:29.938 "driver_specific": { 00:20:29.938 "raid": { 00:20:29.938 "uuid": "29d5b57f-75d0-494b-a44e-7f5e90d2a4ff", 00:20:29.938 "strip_size_kb": 64, 00:20:29.938 "state": "online", 00:20:29.938 "raid_level": "concat", 00:20:29.938 "superblock": 
true, 00:20:29.938 "num_base_bdevs": 4, 00:20:29.938 "num_base_bdevs_discovered": 4, 00:20:29.938 "num_base_bdevs_operational": 4, 00:20:29.938 "base_bdevs_list": [ 00:20:29.938 { 00:20:29.938 "name": "pt1", 00:20:29.938 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:29.938 "is_configured": true, 00:20:29.938 "data_offset": 2048, 00:20:29.938 "data_size": 63488 00:20:29.938 }, 00:20:29.938 { 00:20:29.938 "name": "pt2", 00:20:29.938 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:29.938 "is_configured": true, 00:20:29.938 "data_offset": 2048, 00:20:29.938 "data_size": 63488 00:20:29.938 }, 00:20:29.938 { 00:20:29.938 "name": "pt3", 00:20:29.938 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:29.938 "is_configured": true, 00:20:29.938 "data_offset": 2048, 00:20:29.938 "data_size": 63488 00:20:29.938 }, 00:20:29.938 { 00:20:29.938 "name": "pt4", 00:20:29.938 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:29.938 "is_configured": true, 00:20:29.938 "data_offset": 2048, 00:20:29.938 "data_size": 63488 00:20:29.938 } 00:20:29.938 ] 00:20:29.938 } 00:20:29.938 } 00:20:29.938 }' 00:20:29.938 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:29.938 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:29.938 pt2 00:20:29.938 pt3 00:20:29.938 pt4' 00:20:29.938 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:29.938 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:29.938 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:30.197 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:30.197 "name": "pt1", 00:20:30.197 "aliases": [ 00:20:30.197 "00000000-0000-0000-0000-000000000001" 00:20:30.197 ], 00:20:30.197 "product_name": "passthru", 00:20:30.197 "block_size": 512, 00:20:30.197 "num_blocks": 65536, 00:20:30.197 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:30.197 "assigned_rate_limits": { 00:20:30.197 "rw_ios_per_sec": 0, 00:20:30.197 "rw_mbytes_per_sec": 0, 00:20:30.197 "r_mbytes_per_sec": 0, 00:20:30.197 "w_mbytes_per_sec": 0 00:20:30.197 }, 00:20:30.197 "claimed": true, 00:20:30.197 "claim_type": "exclusive_write", 00:20:30.197 "zoned": false, 00:20:30.197 "supported_io_types": { 00:20:30.197 "read": true, 00:20:30.197 "write": true, 00:20:30.197 "unmap": true, 00:20:30.197 "flush": true, 00:20:30.197 "reset": true, 00:20:30.197 "nvme_admin": false, 00:20:30.197 "nvme_io": false, 00:20:30.197 "nvme_io_md": false, 00:20:30.197 "write_zeroes": true, 00:20:30.197 "zcopy": true, 00:20:30.197 "get_zone_info": false, 00:20:30.197 "zone_management": false, 00:20:30.197 "zone_append": false, 00:20:30.197 "compare": false, 00:20:30.197 "compare_and_write": false, 00:20:30.197 "abort": true, 00:20:30.197 "seek_hole": false, 00:20:30.197 "seek_data": false, 00:20:30.197 "copy": true, 00:20:30.197 "nvme_iov_md": false 00:20:30.197 }, 00:20:30.197 "memory_domains": [ 00:20:30.197 { 00:20:30.197 "dma_device_id": "system", 00:20:30.197 "dma_device_type": 1 00:20:30.197 }, 00:20:30.197 { 00:20:30.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.197 "dma_device_type": 2 00:20:30.197 } 00:20:30.197 ], 00:20:30.197 "driver_specific": { 00:20:30.197 "passthru": 
{ 00:20:30.197 "name": "pt1", 00:20:30.197 "base_bdev_name": "malloc1" 00:20:30.197 } 00:20:30.197 } 00:20:30.197 }' 00:20:30.197 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:30.197 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:30.197 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:30.197 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:30.197 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:30.197 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:30.456 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:30.456 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:30.456 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:30.456 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:30.456 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:30.456 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:30.456 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:30.456 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:30.456 13:46:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:30.714 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:30.714 "name": "pt2", 00:20:30.714 "aliases": [ 00:20:30.714 "00000000-0000-0000-0000-000000000002" 00:20:30.714 ], 00:20:30.714 "product_name": "passthru", 00:20:30.714 "block_size": 512, 00:20:30.714 "num_blocks": 65536, 00:20:30.714 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:30.714 "assigned_rate_limits": { 00:20:30.714 "rw_ios_per_sec": 0, 00:20:30.714 "rw_mbytes_per_sec": 0, 00:20:30.714 "r_mbytes_per_sec": 0, 00:20:30.714 "w_mbytes_per_sec": 0 00:20:30.714 }, 00:20:30.714 "claimed": true, 00:20:30.714 "claim_type": "exclusive_write", 00:20:30.714 "zoned": false, 00:20:30.714 "supported_io_types": { 00:20:30.714 "read": true, 00:20:30.714 "write": true, 00:20:30.714 "unmap": true, 00:20:30.714 "flush": true, 00:20:30.714 "reset": true, 00:20:30.714 "nvme_admin": false, 00:20:30.714 "nvme_io": false, 00:20:30.714 "nvme_io_md": false, 00:20:30.714 "write_zeroes": true, 00:20:30.714 "zcopy": true, 00:20:30.714 "get_zone_info": false, 00:20:30.714 "zone_management": false, 00:20:30.714 "zone_append": false, 00:20:30.714 "compare": false, 00:20:30.714 "compare_and_write": false, 00:20:30.714 "abort": true, 00:20:30.714 "seek_hole": false, 00:20:30.714 "seek_data": false, 00:20:30.714 "copy": true, 00:20:30.714 "nvme_iov_md": false 00:20:30.714 }, 00:20:30.714 "memory_domains": [ 00:20:30.714 { 00:20:30.714 "dma_device_id": "system", 00:20:30.714 "dma_device_type": 1 00:20:30.715 }, 00:20:30.715 { 00:20:30.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.715 "dma_device_type": 2 00:20:30.715 } 00:20:30.715 ], 00:20:30.715 "driver_specific": { 00:20:30.715 "passthru": { 00:20:30.715 "name": "pt2", 00:20:30.715 "base_bdev_name": "malloc2" 00:20:30.715 } 00:20:30.715 } 00:20:30.715 }' 00:20:30.715 
13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:30.715 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:30.973 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:30.973 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:30.973 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:30.973 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:30.973 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:30.973 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:30.973 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:30.973 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:30.973 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.233 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:31.233 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:31.233 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:31.233 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:31.233 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:31.233 "name": "pt3", 00:20:31.233 "aliases": [ 00:20:31.233 "00000000-0000-0000-0000-000000000003" 00:20:31.233 ], 00:20:31.233 "product_name": "passthru", 00:20:31.233 "block_size": 512, 00:20:31.233 "num_blocks": 65536, 00:20:31.233 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:31.233 "assigned_rate_limits": { 00:20:31.233 "rw_ios_per_sec": 0, 00:20:31.233 "rw_mbytes_per_sec": 0, 00:20:31.233 "r_mbytes_per_sec": 0, 00:20:31.233 "w_mbytes_per_sec": 0 00:20:31.233 }, 00:20:31.233 "claimed": true, 00:20:31.233 "claim_type": "exclusive_write", 00:20:31.233 "zoned": false, 00:20:31.233 "supported_io_types": { 00:20:31.233 "read": true, 00:20:31.233 "write": true, 00:20:31.233 "unmap": true, 00:20:31.233 "flush": true, 00:20:31.233 "reset": true, 00:20:31.233 "nvme_admin": false, 00:20:31.233 "nvme_io": false, 00:20:31.233 "nvme_io_md": false, 00:20:31.233 "write_zeroes": true, 00:20:31.233 "zcopy": true, 00:20:31.233 "get_zone_info": false, 00:20:31.233 "zone_management": false, 00:20:31.233 "zone_append": false, 00:20:31.233 "compare": false, 00:20:31.233 "compare_and_write": false, 00:20:31.233 "abort": true, 00:20:31.233 "seek_hole": false, 00:20:31.233 "seek_data": false, 00:20:31.233 "copy": true, 00:20:31.233 "nvme_iov_md": false 00:20:31.233 }, 00:20:31.233 "memory_domains": [ 00:20:31.233 { 00:20:31.233 "dma_device_id": "system", 00:20:31.233 "dma_device_type": 1 00:20:31.233 }, 00:20:31.233 { 00:20:31.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.233 "dma_device_type": 2 00:20:31.233 } 00:20:31.233 ], 00:20:31.233 "driver_specific": { 00:20:31.233 "passthru": { 00:20:31.233 "name": "pt3", 00:20:31.233 "base_bdev_name": "malloc3" 00:20:31.233 } 00:20:31.233 } 00:20:31.233 }' 00:20:31.233 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.492 13:46:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.492 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:31.492 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.492 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.492 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:31.492 13:46:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.492 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.492 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:31.492 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.751 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.751 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:31.751 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:31.751 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:31.751 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:32.009 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:32.009 "name": "pt4", 00:20:32.009 "aliases": [ 00:20:32.009 "00000000-0000-0000-0000-000000000004" 00:20:32.009 ], 00:20:32.009 "product_name": "passthru", 00:20:32.009 "block_size": 512, 00:20:32.009 "num_blocks": 65536, 00:20:32.010 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:32.010 "assigned_rate_limits": { 00:20:32.010 "rw_ios_per_sec": 0, 00:20:32.010 "rw_mbytes_per_sec": 0, 00:20:32.010 "r_mbytes_per_sec": 0, 00:20:32.010 "w_mbytes_per_sec": 0 00:20:32.010 }, 00:20:32.010 "claimed": true, 00:20:32.010 "claim_type": "exclusive_write", 00:20:32.010 "zoned": false, 00:20:32.010 "supported_io_types": { 00:20:32.010 "read": true, 00:20:32.010 "write": true, 00:20:32.010 "unmap": true, 00:20:32.010 "flush": true, 00:20:32.010 "reset": true, 00:20:32.010 "nvme_admin": false, 00:20:32.010 "nvme_io": false, 00:20:32.010 "nvme_io_md": false, 00:20:32.010 "write_zeroes": true, 00:20:32.010 "zcopy": true, 00:20:32.010 "get_zone_info": false, 00:20:32.010 "zone_management": false, 00:20:32.010 "zone_append": false, 00:20:32.010 "compare": false, 00:20:32.010 "compare_and_write": false, 00:20:32.010 "abort": true, 00:20:32.010 "seek_hole": false, 00:20:32.010 "seek_data": false, 00:20:32.010 "copy": true, 00:20:32.010 "nvme_iov_md": false 00:20:32.010 }, 00:20:32.010 "memory_domains": [ 00:20:32.010 { 00:20:32.010 "dma_device_id": "system", 00:20:32.010 "dma_device_type": 1 00:20:32.010 }, 00:20:32.010 { 00:20:32.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.010 "dma_device_type": 2 00:20:32.010 } 00:20:32.010 ], 00:20:32.010 "driver_specific": { 00:20:32.010 "passthru": { 00:20:32.010 "name": "pt4", 00:20:32.010 "base_bdev_name": "malloc4" 00:20:32.010 } 00:20:32.010 } 00:20:32.010 }' 00:20:32.010 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.010 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.010 13:46:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:32.010 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.010 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.010 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:32.010 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.269 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.269 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:32.269 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.269 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.269 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:32.269 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:32.269 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:32.527 [2024-07-12 13:46:20.971113] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:32.528 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=29d5b57f-75d0-494b-a44e-7f5e90d2a4ff 00:20:32.528 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 29d5b57f-75d0-494b-a44e-7f5e90d2a4ff ']' 00:20:32.528 13:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:32.786 [2024-07-12 13:46:21.215437] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:32.786 [2024-07-12 13:46:21.215457] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:32.786 [2024-07-12 13:46:21.215504] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:32.786 [2024-07-12 13:46:21.215566] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:32.786 [2024-07-12 13:46:21.215578] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2039c20 name raid_bdev1, state offline 00:20:32.786 13:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.786 13:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:20:33.045 13:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:20:33.045 13:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:20:33.045 13:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:33.045 13:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:33.303 13:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:33.303 13:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:33.562 13:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:33.562 13:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:33.820 13:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:33.820 13:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:34.079 13:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:34.079 13:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:34.338 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:34.338 [2024-07-12 13:46:22.915879] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:34.338 [2024-07-12 13:46:22.917273] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:34.338 [2024-07-12 13:46:22.917319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
00:20:34.338 [2024-07-12 13:46:22.917355] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:20:34.338 [2024-07-12 13:46:22.917401] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:34.338 [2024-07-12 13:46:22.917440] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:34.338 [2024-07-12 13:46:22.917470] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:34.338 [2024-07-12 13:46:22.917492] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:20:34.338 [2024-07-12 13:46:22.917509] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:34.338 [2024-07-12 13:46:22.917520] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20d9a40 name raid_bdev1, state configuring 00:20:34.598 request: 00:20:34.598 { 00:20:34.598 "name": "raid_bdev1", 00:20:34.598 "raid_level": "concat", 00:20:34.598 "base_bdevs": [ 00:20:34.598 "malloc1", 00:20:34.598 "malloc2", 00:20:34.598 "malloc3", 00:20:34.598 "malloc4" 00:20:34.598 ], 00:20:34.598 "strip_size_kb": 64, 00:20:34.598 "superblock": false, 00:20:34.598 "method": "bdev_raid_create", 00:20:34.598 "req_id": 1 00:20:34.598 } 00:20:34.598 Got JSON-RPC error response 00:20:34.598 response: 00:20:34.598 { 00:20:34.598 "code": -17, 00:20:34.598 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:34.598 } 00:20:34.598 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:20:34.598 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:34.598 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:34.598 13:46:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:34.598 13:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.598 13:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:34.856 [2024-07-12 13:46:23.413125] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:34.856 [2024-07-12 13:46:23.413168] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.856 [2024-07-12 13:46:23.413186] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2037ab0 00:20:34.856 [2024-07-12 13:46:23.413198] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.856 [2024-07-12 13:46:23.414812] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.856 [2024-07-12 13:46:23.414841] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:34.856 [2024-07-12 
13:46:23.414908] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:34.856 [2024-07-12 13:46:23.414943] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:34.856 pt1 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:34.856 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.115 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:35.115 "name": "raid_bdev1", 00:20:35.115 "uuid": "29d5b57f-75d0-494b-a44e-7f5e90d2a4ff", 00:20:35.115 "strip_size_kb": 64, 00:20:35.115 "state": "configuring", 00:20:35.115 "raid_level": "concat", 00:20:35.115 "superblock": true, 00:20:35.115 "num_base_bdevs": 4, 00:20:35.115 "num_base_bdevs_discovered": 1, 00:20:35.115 "num_base_bdevs_operational": 4, 00:20:35.115 "base_bdevs_list": [ 00:20:35.115 { 00:20:35.115 "name": "pt1", 00:20:35.115 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:35.115 "is_configured": true, 00:20:35.115 "data_offset": 2048, 00:20:35.115 "data_size": 63488 00:20:35.115 }, 00:20:35.115 { 00:20:35.115 "name": null, 00:20:35.115 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:35.115 "is_configured": false, 00:20:35.115 "data_offset": 2048, 00:20:35.115 "data_size": 63488 00:20:35.115 }, 00:20:35.115 { 00:20:35.115 "name": null, 00:20:35.115 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:35.115 "is_configured": false, 00:20:35.115 "data_offset": 2048, 00:20:35.115 "data_size": 63488 00:20:35.115 }, 00:20:35.115 { 00:20:35.115 "name": null, 00:20:35.115 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:35.115 "is_configured": false, 00:20:35.115 "data_offset": 2048, 00:20:35.115 "data_size": 63488 00:20:35.115 } 00:20:35.115 ] 00:20:35.115 }' 00:20:35.115 13:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:35.115 13:46:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:36.051 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:20:36.051 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:36.051 [2024-07-12 13:46:24.496019] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:36.051 [2024-07-12 13:46:24.496089] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:36.051 [2024-07-12 13:46:24.496118] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20399b0 00:20:36.051 [2024-07-12 13:46:24.496137] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:36.051 [2024-07-12 13:46:24.496475] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:36.051 [2024-07-12 13:46:24.496494] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:36.051 [2024-07-12 13:46:24.496556] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:36.051 [2024-07-12 13:46:24.496574] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:36.051 pt2 00:20:36.051 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:36.310 [2024-07-12 13:46:24.744682] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.310 13:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:36.569 13:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.569 "name": "raid_bdev1", 00:20:36.569 "uuid": "29d5b57f-75d0-494b-a44e-7f5e90d2a4ff", 00:20:36.569 "strip_size_kb": 64, 00:20:36.569 "state": "configuring", 00:20:36.569 "raid_level": "concat", 00:20:36.569 "superblock": true, 00:20:36.569 "num_base_bdevs": 4, 00:20:36.569 "num_base_bdevs_discovered": 1, 00:20:36.569 "num_base_bdevs_operational": 4, 00:20:36.569 "base_bdevs_list": [ 00:20:36.569 { 00:20:36.569 "name": "pt1", 00:20:36.569 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:36.569 "is_configured": true, 00:20:36.569 "data_offset": 2048, 00:20:36.569 "data_size": 63488 00:20:36.569 }, 00:20:36.569 
{ 00:20:36.569 "name": null, 00:20:36.569 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:36.569 "is_configured": false, 00:20:36.569 "data_offset": 2048, 00:20:36.569 "data_size": 63488 00:20:36.569 }, 00:20:36.569 { 00:20:36.569 "name": null, 00:20:36.569 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:36.569 "is_configured": false, 00:20:36.569 "data_offset": 2048, 00:20:36.569 "data_size": 63488 00:20:36.569 }, 00:20:36.569 { 00:20:36.569 "name": null, 00:20:36.569 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:36.569 "is_configured": false, 00:20:36.569 "data_offset": 2048, 00:20:36.569 "data_size": 63488 00:20:36.569 } 00:20:36.569 ] 00:20:36.569 }' 00:20:36.569 13:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.569 13:46:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:37.136 13:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:20:37.137 13:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:37.137 13:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:37.395 [2024-07-12 13:46:25.895743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:37.395 [2024-07-12 13:46:25.895791] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:37.395 [2024-07-12 13:46:25.895809] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d9190 00:20:37.395 [2024-07-12 13:46:25.895822] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:37.395 [2024-07-12 13:46:25.896164] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:37.395 [2024-07-12 13:46:25.896183] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:37.395 [2024-07-12 13:46:25.896245] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:37.395 [2024-07-12 13:46:25.896264] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:37.395 pt2 00:20:37.395 13:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:37.395 13:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:37.395 13:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:37.654 [2024-07-12 13:46:26.140395] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:37.654 [2024-07-12 13:46:26.140443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:37.654 [2024-07-12 13:46:26.140463] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d5450 00:20:37.654 [2024-07-12 13:46:26.140476] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:37.654 [2024-07-12 13:46:26.140802] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:37.654 [2024-07-12 13:46:26.140822] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:37.654 [2024-07-12 13:46:26.140882] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:37.654 [2024-07-12 13:46:26.140900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:37.654 pt3 00:20:37.654 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:37.654 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:37.654 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:37.913 [2024-07-12 13:46:26.385041] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:37.913 [2024-07-12 13:46:26.385083] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:37.913 [2024-07-12 13:46:26.385102] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d97a0 00:20:37.913 [2024-07-12 13:46:26.385114] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:37.913 [2024-07-12 13:46:26.385445] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:37.913 [2024-07-12 13:46:26.385462] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:37.913 [2024-07-12 13:46:26.385523] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:37.913 [2024-07-12 13:46:26.385541] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:37.913 [2024-07-12 13:46:26.385659] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20e2f40 00:20:37.913 [2024-07-12 13:46:26.385669] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:37.913 [2024-07-12 13:46:26.385838] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2031310 00:20:37.913 [2024-07-12 13:46:26.385982] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20e2f40 00:20:37.913 [2024-07-12 13:46:26.385993] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20e2f40 00:20:37.913 [2024-07-12 13:46:26.386092] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:37.913 pt4 00:20:37.913 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:37.913 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:37.913 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:37.913 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:37.913 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:37.913 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:37.914 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:37.914 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:37.914 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:37.914 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.914 13:46:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.914 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.914 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.914 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:38.173 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:38.173 "name": "raid_bdev1", 00:20:38.173 "uuid": "29d5b57f-75d0-494b-a44e-7f5e90d2a4ff", 00:20:38.173 "strip_size_kb": 64, 00:20:38.173 "state": "online", 00:20:38.173 "raid_level": "concat", 00:20:38.173 "superblock": true, 00:20:38.173 "num_base_bdevs": 4, 00:20:38.173 "num_base_bdevs_discovered": 4, 00:20:38.173 "num_base_bdevs_operational": 4, 00:20:38.173 "base_bdevs_list": [ 00:20:38.173 { 00:20:38.173 "name": "pt1", 00:20:38.173 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:38.173 "is_configured": true, 00:20:38.173 "data_offset": 2048, 00:20:38.173 "data_size": 63488 00:20:38.173 }, 00:20:38.173 { 00:20:38.173 "name": "pt2", 00:20:38.173 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:38.173 "is_configured": true, 00:20:38.173 "data_offset": 2048, 00:20:38.173 "data_size": 63488 00:20:38.173 }, 00:20:38.173 { 00:20:38.173 "name": "pt3", 00:20:38.173 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:38.173 "is_configured": true, 00:20:38.173 "data_offset": 2048, 00:20:38.173 "data_size": 63488 00:20:38.173 }, 00:20:38.173 { 00:20:38.173 "name": "pt4", 00:20:38.173 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:38.173 "is_configured": true, 00:20:38.173 "data_offset": 2048, 00:20:38.173 "data_size": 63488 00:20:38.173 } 00:20:38.173 ] 00:20:38.173 }' 00:20:38.173 13:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:38.173 13:46:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:38.741 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:20:38.741 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:38.741 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:38.741 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:38.741 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:38.741 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:38.741 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:38.741 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:39.000 [2024-07-12 13:46:27.540416] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:39.000 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:39.000 "name": "raid_bdev1", 00:20:39.000 "aliases": [ 00:20:39.000 "29d5b57f-75d0-494b-a44e-7f5e90d2a4ff" 00:20:39.000 ], 00:20:39.000 "product_name": "Raid Volume", 00:20:39.000 "block_size": 512, 00:20:39.000 "num_blocks": 253952, 00:20:39.000 "uuid": 
"29d5b57f-75d0-494b-a44e-7f5e90d2a4ff", 00:20:39.000 "assigned_rate_limits": { 00:20:39.000 "rw_ios_per_sec": 0, 00:20:39.000 "rw_mbytes_per_sec": 0, 00:20:39.000 "r_mbytes_per_sec": 0, 00:20:39.000 "w_mbytes_per_sec": 0 00:20:39.000 }, 00:20:39.000 "claimed": false, 00:20:39.000 "zoned": false, 00:20:39.000 "supported_io_types": { 00:20:39.000 "read": true, 00:20:39.000 "write": true, 00:20:39.000 "unmap": true, 00:20:39.000 "flush": true, 00:20:39.000 "reset": true, 00:20:39.000 "nvme_admin": false, 00:20:39.000 "nvme_io": false, 00:20:39.000 "nvme_io_md": false, 00:20:39.000 "write_zeroes": true, 00:20:39.000 "zcopy": false, 00:20:39.000 "get_zone_info": false, 00:20:39.000 "zone_management": false, 00:20:39.000 "zone_append": false, 00:20:39.000 "compare": false, 00:20:39.000 "compare_and_write": false, 00:20:39.000 "abort": false, 00:20:39.000 "seek_hole": false, 00:20:39.000 "seek_data": false, 00:20:39.000 "copy": false, 00:20:39.000 "nvme_iov_md": false 00:20:39.000 }, 00:20:39.000 "memory_domains": [ 00:20:39.000 { 00:20:39.000 "dma_device_id": "system", 00:20:39.000 "dma_device_type": 1 00:20:39.000 }, 00:20:39.000 { 00:20:39.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.000 "dma_device_type": 2 00:20:39.000 }, 00:20:39.000 { 00:20:39.000 "dma_device_id": "system", 00:20:39.000 "dma_device_type": 1 00:20:39.000 }, 00:20:39.000 { 00:20:39.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.000 "dma_device_type": 2 00:20:39.000 }, 00:20:39.000 { 00:20:39.000 "dma_device_id": "system", 00:20:39.000 "dma_device_type": 1 00:20:39.000 }, 00:20:39.000 { 00:20:39.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.000 "dma_device_type": 2 00:20:39.000 }, 00:20:39.000 { 00:20:39.000 "dma_device_id": "system", 00:20:39.000 "dma_device_type": 1 00:20:39.000 }, 00:20:39.000 { 00:20:39.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.000 "dma_device_type": 2 00:20:39.000 } 00:20:39.000 ], 00:20:39.000 "driver_specific": { 00:20:39.000 "raid": { 00:20:39.000 "uuid": "29d5b57f-75d0-494b-a44e-7f5e90d2a4ff", 00:20:39.000 "strip_size_kb": 64, 00:20:39.000 "state": "online", 00:20:39.000 "raid_level": "concat", 00:20:39.000 "superblock": true, 00:20:39.000 "num_base_bdevs": 4, 00:20:39.000 "num_base_bdevs_discovered": 4, 00:20:39.000 "num_base_bdevs_operational": 4, 00:20:39.000 "base_bdevs_list": [ 00:20:39.000 { 00:20:39.000 "name": "pt1", 00:20:39.000 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:39.000 "is_configured": true, 00:20:39.000 "data_offset": 2048, 00:20:39.000 "data_size": 63488 00:20:39.000 }, 00:20:39.000 { 00:20:39.000 "name": "pt2", 00:20:39.000 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:39.000 "is_configured": true, 00:20:39.000 "data_offset": 2048, 00:20:39.000 "data_size": 63488 00:20:39.000 }, 00:20:39.000 { 00:20:39.001 "name": "pt3", 00:20:39.001 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:39.001 "is_configured": true, 00:20:39.001 "data_offset": 2048, 00:20:39.001 "data_size": 63488 00:20:39.001 }, 00:20:39.001 { 00:20:39.001 "name": "pt4", 00:20:39.001 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:39.001 "is_configured": true, 00:20:39.001 "data_offset": 2048, 00:20:39.001 "data_size": 63488 00:20:39.001 } 00:20:39.001 ] 00:20:39.001 } 00:20:39.001 } 00:20:39.001 }' 00:20:39.001 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:39.259 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:20:39.259 pt2 00:20:39.260 pt3 00:20:39.260 pt4' 00:20:39.260 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:39.260 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:39.260 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:39.260 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:39.260 "name": "pt1", 00:20:39.260 "aliases": [ 00:20:39.260 "00000000-0000-0000-0000-000000000001" 00:20:39.260 ], 00:20:39.260 "product_name": "passthru", 00:20:39.260 "block_size": 512, 00:20:39.260 "num_blocks": 65536, 00:20:39.260 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:39.260 "assigned_rate_limits": { 00:20:39.260 "rw_ios_per_sec": 0, 00:20:39.260 "rw_mbytes_per_sec": 0, 00:20:39.260 "r_mbytes_per_sec": 0, 00:20:39.260 "w_mbytes_per_sec": 0 00:20:39.260 }, 00:20:39.260 "claimed": true, 00:20:39.260 "claim_type": "exclusive_write", 00:20:39.260 "zoned": false, 00:20:39.260 "supported_io_types": { 00:20:39.260 "read": true, 00:20:39.260 "write": true, 00:20:39.260 "unmap": true, 00:20:39.260 "flush": true, 00:20:39.260 "reset": true, 00:20:39.260 "nvme_admin": false, 00:20:39.260 "nvme_io": false, 00:20:39.260 "nvme_io_md": false, 00:20:39.260 "write_zeroes": true, 00:20:39.260 "zcopy": true, 00:20:39.260 "get_zone_info": false, 00:20:39.260 "zone_management": false, 00:20:39.260 "zone_append": false, 00:20:39.260 "compare": false, 00:20:39.260 "compare_and_write": false, 00:20:39.260 "abort": true, 00:20:39.260 "seek_hole": false, 00:20:39.260 "seek_data": false, 00:20:39.260 "copy": true, 00:20:39.260 "nvme_iov_md": false 00:20:39.260 }, 00:20:39.260 "memory_domains": [ 00:20:39.260 { 00:20:39.260 "dma_device_id": "system", 00:20:39.260 "dma_device_type": 1 00:20:39.260 }, 00:20:39.260 { 00:20:39.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.260 "dma_device_type": 2 00:20:39.260 } 00:20:39.260 ], 00:20:39.260 "driver_specific": { 00:20:39.260 "passthru": { 00:20:39.260 "name": "pt1", 00:20:39.260 "base_bdev_name": "malloc1" 00:20:39.260 } 00:20:39.260 } 00:20:39.260 }' 00:20:39.260 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:39.518 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:39.518 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:39.518 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:39.518 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:39.518 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:39.518 13:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:39.518 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:39.518 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:39.518 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:39.777 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:39.777 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:39.777 13:46:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:39.777 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:39.777 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:40.035 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:40.035 "name": "pt2", 00:20:40.035 "aliases": [ 00:20:40.035 "00000000-0000-0000-0000-000000000002" 00:20:40.035 ], 00:20:40.035 "product_name": "passthru", 00:20:40.035 "block_size": 512, 00:20:40.035 "num_blocks": 65536, 00:20:40.035 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:40.035 "assigned_rate_limits": { 00:20:40.035 "rw_ios_per_sec": 0, 00:20:40.035 "rw_mbytes_per_sec": 0, 00:20:40.035 "r_mbytes_per_sec": 0, 00:20:40.035 "w_mbytes_per_sec": 0 00:20:40.035 }, 00:20:40.035 "claimed": true, 00:20:40.035 "claim_type": "exclusive_write", 00:20:40.035 "zoned": false, 00:20:40.035 "supported_io_types": { 00:20:40.035 "read": true, 00:20:40.035 "write": true, 00:20:40.035 "unmap": true, 00:20:40.035 "flush": true, 00:20:40.035 "reset": true, 00:20:40.035 "nvme_admin": false, 00:20:40.035 "nvme_io": false, 00:20:40.035 "nvme_io_md": false, 00:20:40.035 "write_zeroes": true, 00:20:40.035 "zcopy": true, 00:20:40.035 "get_zone_info": false, 00:20:40.035 "zone_management": false, 00:20:40.035 "zone_append": false, 00:20:40.035 "compare": false, 00:20:40.035 "compare_and_write": false, 00:20:40.035 "abort": true, 00:20:40.035 "seek_hole": false, 00:20:40.035 "seek_data": false, 00:20:40.035 "copy": true, 00:20:40.035 "nvme_iov_md": false 00:20:40.035 }, 00:20:40.035 "memory_domains": [ 00:20:40.036 { 00:20:40.036 "dma_device_id": "system", 00:20:40.036 "dma_device_type": 1 00:20:40.036 }, 00:20:40.036 { 00:20:40.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.036 "dma_device_type": 2 00:20:40.036 } 00:20:40.036 ], 00:20:40.036 "driver_specific": { 00:20:40.036 "passthru": { 00:20:40.036 "name": "pt2", 00:20:40.036 "base_bdev_name": "malloc2" 00:20:40.036 } 00:20:40.036 } 00:20:40.036 }' 00:20:40.036 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:40.036 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:40.036 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:40.036 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:40.293 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:40.293 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:40.293 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:40.293 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:40.293 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:40.293 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:40.551 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:40.551 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:40.551 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:40.551 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:40.551 13:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:40.809 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:40.809 "name": "pt3", 00:20:40.809 "aliases": [ 00:20:40.809 "00000000-0000-0000-0000-000000000003" 00:20:40.809 ], 00:20:40.809 "product_name": "passthru", 00:20:40.809 "block_size": 512, 00:20:40.809 "num_blocks": 65536, 00:20:40.809 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:40.809 "assigned_rate_limits": { 00:20:40.809 "rw_ios_per_sec": 0, 00:20:40.809 "rw_mbytes_per_sec": 0, 00:20:40.809 "r_mbytes_per_sec": 0, 00:20:40.809 "w_mbytes_per_sec": 0 00:20:40.809 }, 00:20:40.809 "claimed": true, 00:20:40.809 "claim_type": "exclusive_write", 00:20:40.809 "zoned": false, 00:20:40.809 "supported_io_types": { 00:20:40.809 "read": true, 00:20:40.809 "write": true, 00:20:40.809 "unmap": true, 00:20:40.809 "flush": true, 00:20:40.809 "reset": true, 00:20:40.809 "nvme_admin": false, 00:20:40.809 "nvme_io": false, 00:20:40.809 "nvme_io_md": false, 00:20:40.809 "write_zeroes": true, 00:20:40.809 "zcopy": true, 00:20:40.809 "get_zone_info": false, 00:20:40.809 "zone_management": false, 00:20:40.809 "zone_append": false, 00:20:40.809 "compare": false, 00:20:40.809 "compare_and_write": false, 00:20:40.809 "abort": true, 00:20:40.809 "seek_hole": false, 00:20:40.809 "seek_data": false, 00:20:40.809 "copy": true, 00:20:40.809 "nvme_iov_md": false 00:20:40.809 }, 00:20:40.809 "memory_domains": [ 00:20:40.809 { 00:20:40.809 "dma_device_id": "system", 00:20:40.809 "dma_device_type": 1 00:20:40.809 }, 00:20:40.809 { 00:20:40.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.809 "dma_device_type": 2 00:20:40.809 } 00:20:40.809 ], 00:20:40.809 "driver_specific": { 00:20:40.809 "passthru": { 00:20:40.809 "name": "pt3", 00:20:40.809 "base_bdev_name": "malloc3" 00:20:40.809 } 00:20:40.809 } 00:20:40.809 }' 00:20:40.809 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:40.809 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:40.809 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:40.809 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:40.809 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:41.067 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:41.067 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:41.067 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:41.067 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:41.067 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:41.067 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:41.067 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:41.067 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:41.067 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:41.067 
13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:41.326 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:41.326 "name": "pt4", 00:20:41.326 "aliases": [ 00:20:41.326 "00000000-0000-0000-0000-000000000004" 00:20:41.326 ], 00:20:41.326 "product_name": "passthru", 00:20:41.326 "block_size": 512, 00:20:41.326 "num_blocks": 65536, 00:20:41.326 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:41.326 "assigned_rate_limits": { 00:20:41.326 "rw_ios_per_sec": 0, 00:20:41.326 "rw_mbytes_per_sec": 0, 00:20:41.326 "r_mbytes_per_sec": 0, 00:20:41.326 "w_mbytes_per_sec": 0 00:20:41.326 }, 00:20:41.326 "claimed": true, 00:20:41.326 "claim_type": "exclusive_write", 00:20:41.326 "zoned": false, 00:20:41.326 "supported_io_types": { 00:20:41.326 "read": true, 00:20:41.326 "write": true, 00:20:41.326 "unmap": true, 00:20:41.326 "flush": true, 00:20:41.326 "reset": true, 00:20:41.326 "nvme_admin": false, 00:20:41.326 "nvme_io": false, 00:20:41.326 "nvme_io_md": false, 00:20:41.326 "write_zeroes": true, 00:20:41.326 "zcopy": true, 00:20:41.326 "get_zone_info": false, 00:20:41.326 "zone_management": false, 00:20:41.326 "zone_append": false, 00:20:41.326 "compare": false, 00:20:41.326 "compare_and_write": false, 00:20:41.326 "abort": true, 00:20:41.326 "seek_hole": false, 00:20:41.326 "seek_data": false, 00:20:41.326 "copy": true, 00:20:41.326 "nvme_iov_md": false 00:20:41.326 }, 00:20:41.326 "memory_domains": [ 00:20:41.326 { 00:20:41.326 "dma_device_id": "system", 00:20:41.326 "dma_device_type": 1 00:20:41.326 }, 00:20:41.326 { 00:20:41.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:41.326 "dma_device_type": 2 00:20:41.326 } 00:20:41.326 ], 00:20:41.326 "driver_specific": { 00:20:41.326 "passthru": { 00:20:41.326 "name": "pt4", 00:20:41.326 "base_bdev_name": "malloc4" 00:20:41.326 } 00:20:41.326 } 00:20:41.326 }' 00:20:41.326 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:41.584 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:41.584 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:41.584 13:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:41.584 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:41.584 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:41.584 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:41.584 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:41.843 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:41.843 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:41.843 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:41.843 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:41.843 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:41.843 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:20:42.102 [2024-07-12 13:46:30.488248] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:42.102 13:46:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 29d5b57f-75d0-494b-a44e-7f5e90d2a4ff '!=' 29d5b57f-75d0-494b-a44e-7f5e90d2a4ff ']' 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 515867 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 515867 ']' 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 515867 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 515867 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 515867' 00:20:42.102 killing process with pid 515867 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 515867 00:20:42.102 [2024-07-12 13:46:30.564364] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:42.102 [2024-07-12 13:46:30.564424] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:42.102 [2024-07-12 13:46:30.564485] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:42.102 [2024-07-12 13:46:30.564499] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20e2f40 name raid_bdev1, state offline 00:20:42.102 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 515867 00:20:42.102 [2024-07-12 13:46:30.603092] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:42.361 13:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:20:42.361 00:20:42.361 real 0m17.087s 00:20:42.361 user 0m30.959s 00:20:42.361 sys 0m2.970s 00:20:42.361 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:42.361 13:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:42.362 ************************************ 00:20:42.362 END TEST raid_superblock_test 00:20:42.362 ************************************ 00:20:42.362 13:46:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:42.362 13:46:30 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:20:42.362 13:46:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:42.362 13:46:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:42.362 13:46:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:42.362 ************************************ 00:20:42.362 START TEST raid_read_error_test 00:20:42.362 ************************************ 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # 
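The raid_superblock_test loop traced above (bdev_raid.sh steps 201-208) pulls the configured base bdev names out of the raid descriptor and then checks four properties of each passthru bdev, before step 486 re-reads the raid uuid to confirm it survived. A stand-alone sketch of that loop, reusing the RPC socket path, script path and jq filters from this run (the wrapper variables and the bare [[ ]] assertions are illustrative assumptions, not the harness's own code):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # names of every configured base bdev of raid_bdev1 (pt1..pt4 in this run)
  base_bdev_names=$($rpc -s $sock bdev_get_bdevs -b raid_bdev1 |
      jq -r '.[] | .driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')

  for name in $base_bdev_names; do
      info=$($rpc -s $sock bdev_get_bdevs -b "$name" | jq '.[]')
      # the trace asserts exactly these four fields for each passthru bdev
      [[ $(jq .block_size    <<< "$info") == 512  ]]
      [[ $(jq .md_size       <<< "$info") == null ]]
      [[ $(jq .md_interleave <<< "$info") == null ]]
      [[ $(jq .dif_type      <<< "$info") == null ]]
  done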
raid_io_error_test concat 4 read 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.QIWkLB7K0C 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=518408 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 518408 /var/tmp/spdk-raid.sock 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 518408 ']' 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:42.362 13:46:30 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:42.362 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:42.362 13:46:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:42.621 [2024-07-12 13:46:30.981326] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:20:42.621 [2024-07-12 13:46:30.981391] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid518408 ] 00:20:42.621 [2024-07-12 13:46:31.107381] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:42.879 [2024-07-12 13:46:31.210381] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:42.879 [2024-07-12 13:46:31.270804] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:42.879 [2024-07-12 13:46:31.270859] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:43.816 13:46:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:43.816 13:46:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:43.816 13:46:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:43.816 13:46:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:44.384 BaseBdev1_malloc 00:20:44.384 13:46:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:44.641 true 00:20:44.641 13:46:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:45.208 [2024-07-12 13:46:33.685773] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:45.208 [2024-07-12 13:46:33.685818] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:45.208 [2024-07-12 13:46:33.685839] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1feba10 00:20:45.208 [2024-07-12 13:46:33.685852] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:45.208 [2024-07-12 13:46:33.687766] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:45.208 [2024-07-12 13:46:33.687795] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:45.208 BaseBdev1 00:20:45.208 13:46:33 bdev_raid.raid_read_error_test -- 
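The "Waiting for process to start up..." exchange and the SPDK startup banner above correspond to raid_io_error_test launching its own bdevperf instance and waiting on its RPC socket before building any bdevs. A condensed sketch of that launch, with the command line copied from step 807 of the trace; the output redirection and PID plumbing are assumptions added for readability (the log only shows that bdevperf's summary later ends up in the mktemp file):

  spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
  sock=/var/tmp/spdk-raid.sock
  bdevperf_log=$(mktemp -p /raidtest)       # /raidtest/tmp.QIWkLB7K0C in this run

  # command line from step 807; -z keeps bdevperf idle until the
  # perform_tests RPC that appears later in the trace
  "$spdk/build/examples/bdevperf" -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
      -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" 2>&1 &
  raid_pid=$!                               # 518408 in this run

  # waitforlisten is the autotest_common.sh helper seen in the trace above
  waitforlisten "$raid_pid" "$sock"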
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:45.208 13:46:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:45.775 BaseBdev2_malloc 00:20:45.775 13:46:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:46.033 true 00:20:46.033 13:46:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:46.600 [2024-07-12 13:46:34.954950] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:46.600 [2024-07-12 13:46:34.954997] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:46.600 [2024-07-12 13:46:34.955018] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ff0250 00:20:46.600 [2024-07-12 13:46:34.955030] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:46.600 [2024-07-12 13:46:34.956646] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:46.600 [2024-07-12 13:46:34.956674] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:46.600 BaseBdev2 00:20:46.600 13:46:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:46.600 13:46:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:47.168 BaseBdev3_malloc 00:20:47.168 13:46:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:47.427 true 00:20:47.686 13:46:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:47.944 [2024-07-12 13:46:36.496833] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:47.944 [2024-07-12 13:46:36.496877] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:47.944 [2024-07-12 13:46:36.496898] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ff2510 00:20:47.944 [2024-07-12 13:46:36.496910] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:47.944 [2024-07-12 13:46:36.498483] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:47.944 [2024-07-12 13:46:36.498510] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:47.944 BaseBdev3 00:20:48.203 13:46:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:48.203 13:46:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:48.203 BaseBdev4_malloc 00:20:48.203 13:46:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:48.461 true 00:20:48.461 13:46:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:49.029 [2024-07-12 13:46:37.524040] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:49.029 [2024-07-12 13:46:37.524084] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:49.029 [2024-07-12 13:46:37.524106] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ff33e0 00:20:49.029 [2024-07-12 13:46:37.524124] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:49.029 [2024-07-12 13:46:37.525722] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:49.029 [2024-07-12 13:46:37.525749] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:49.029 BaseBdev4 00:20:49.030 13:46:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:49.598 [2024-07-12 13:46:38.033420] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:49.598 [2024-07-12 13:46:38.034759] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:49.598 [2024-07-12 13:46:38.034827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:49.598 [2024-07-12 13:46:38.034888] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:49.598 [2024-07-12 13:46:38.035122] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fed560 00:20:49.598 [2024-07-12 13:46:38.035134] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:49.598 [2024-07-12 13:46:38.035334] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e41ba0 00:20:49.598 [2024-07-12 13:46:38.035484] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fed560 00:20:49.598 [2024-07-12 13:46:38.035494] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fed560 00:20:49.598 [2024-07-12 13:46:38.035598] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:49.598 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.166 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.166 "name": "raid_bdev1", 00:20:50.166 "uuid": "20480ded-85b1-4d96-af93-614fd7ae737f", 00:20:50.166 "strip_size_kb": 64, 00:20:50.166 "state": "online", 00:20:50.166 "raid_level": "concat", 00:20:50.166 "superblock": true, 00:20:50.166 "num_base_bdevs": 4, 00:20:50.166 "num_base_bdevs_discovered": 4, 00:20:50.166 "num_base_bdevs_operational": 4, 00:20:50.166 "base_bdevs_list": [ 00:20:50.166 { 00:20:50.166 "name": "BaseBdev1", 00:20:50.166 "uuid": "c0d9734c-3ba7-5d4e-af7e-f2832b8fe8a6", 00:20:50.166 "is_configured": true, 00:20:50.166 "data_offset": 2048, 00:20:50.166 "data_size": 63488 00:20:50.166 }, 00:20:50.166 { 00:20:50.166 "name": "BaseBdev2", 00:20:50.166 "uuid": "9d639268-7fd1-53e1-8a01-0b6433b348c3", 00:20:50.166 "is_configured": true, 00:20:50.166 "data_offset": 2048, 00:20:50.166 "data_size": 63488 00:20:50.166 }, 00:20:50.166 { 00:20:50.166 "name": "BaseBdev3", 00:20:50.166 "uuid": "578512a2-a155-5d65-81e2-a08aaae6d665", 00:20:50.166 "is_configured": true, 00:20:50.166 "data_offset": 2048, 00:20:50.166 "data_size": 63488 00:20:50.166 }, 00:20:50.166 { 00:20:50.166 "name": "BaseBdev4", 00:20:50.166 "uuid": "7f2683fe-a197-53e5-95dc-c8b7267e6b6f", 00:20:50.166 "is_configured": true, 00:20:50.166 "data_offset": 2048, 00:20:50.166 "data_size": 63488 00:20:50.166 } 00:20:50.166 ] 00:20:50.166 }' 00:20:50.166 13:46:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.166 13:46:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:50.734 13:46:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:50.734 13:46:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:50.734 [2024-07-12 13:46:39.256938] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fdf900 00:20:51.671 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:51.931 13:46:40 
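By this point all four BaseBdevN stacks have been created the same way (malloc bdev, then an error-injection bdev, then a passthru bdev on top), the concat raid has been assembled over them at step 819, and verify_raid_bdev_state has confirmed it is online with 4 of 4 base bdevs discovered. Compressed into a stand-alone sketch using only RPC calls that appear verbatim in this log (the loop and variable names are illustrative):

  rpc="$spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  for i in 1 2 3 4; do
      # 32 MiB malloc bdev with 512-byte blocks (65536 blocks, matching the raid dump)
      $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
      # error-injection wrapper; it exposes EE_BaseBdev${i}_malloc
      $rpc bdev_error_create BaseBdev${i}_malloc
      # passthru bdev on top of the error bdev, claimed as BaseBdev${i}
      $rpc bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
  done

  # 64 KiB strip concat raid with an on-disk superblock (-s), as at step 819
  $rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s

  # state check used by verify_raid_bdev_state: online, 4 of 4 base bdevs discovered
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'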
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.931 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:52.191 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:52.191 "name": "raid_bdev1", 00:20:52.191 "uuid": "20480ded-85b1-4d96-af93-614fd7ae737f", 00:20:52.191 "strip_size_kb": 64, 00:20:52.191 "state": "online", 00:20:52.191 "raid_level": "concat", 00:20:52.191 "superblock": true, 00:20:52.191 "num_base_bdevs": 4, 00:20:52.191 "num_base_bdevs_discovered": 4, 00:20:52.191 "num_base_bdevs_operational": 4, 00:20:52.191 "base_bdevs_list": [ 00:20:52.191 { 00:20:52.191 "name": "BaseBdev1", 00:20:52.191 "uuid": "c0d9734c-3ba7-5d4e-af7e-f2832b8fe8a6", 00:20:52.191 "is_configured": true, 00:20:52.191 "data_offset": 2048, 00:20:52.191 "data_size": 63488 00:20:52.191 }, 00:20:52.191 { 00:20:52.191 "name": "BaseBdev2", 00:20:52.191 "uuid": "9d639268-7fd1-53e1-8a01-0b6433b348c3", 00:20:52.191 "is_configured": true, 00:20:52.191 "data_offset": 2048, 00:20:52.191 "data_size": 63488 00:20:52.191 }, 00:20:52.191 { 00:20:52.191 "name": "BaseBdev3", 00:20:52.191 "uuid": "578512a2-a155-5d65-81e2-a08aaae6d665", 00:20:52.191 "is_configured": true, 00:20:52.191 "data_offset": 2048, 00:20:52.191 "data_size": 63488 00:20:52.191 }, 00:20:52.191 { 00:20:52.191 "name": "BaseBdev4", 00:20:52.191 "uuid": "7f2683fe-a197-53e5-95dc-c8b7267e6b6f", 00:20:52.191 "is_configured": true, 00:20:52.191 "data_offset": 2048, 00:20:52.191 "data_size": 63488 00:20:52.191 } 00:20:52.191 ] 00:20:52.191 }' 00:20:52.191 13:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:52.191 13:46:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:53.129 13:46:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:53.389 [2024-07-12 13:46:41.730354] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:53.389 [2024-07-12 13:46:41.730387] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:53.389 [2024-07-12 13:46:41.733555] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:53.389 [2024-07-12 13:46:41.733594] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:53.389 [2024-07-12 13:46:41.733635] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:53.389 [2024-07-12 13:46:41.733646] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fed560 name raid_bdev1, state offline 00:20:53.389 0 00:20:53.389 13:46:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 518408 00:20:53.389 13:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 518408 ']' 00:20:53.389 13:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 518408 00:20:53.389 13:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:20:53.389 13:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:53.389 13:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 518408 00:20:53.389 13:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:53.389 13:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:53.389 13:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 518408' 00:20:53.389 killing process with pid 518408 00:20:53.389 13:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 518408 00:20:53.389 [2024-07-12 13:46:41.800492] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:53.389 13:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 518408 00:20:53.389 [2024-07-12 13:46:41.832931] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:53.649 13:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.QIWkLB7K0C 00:20:53.649 13:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:53.649 13:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:53.649 13:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.41 00:20:53.649 13:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:53.649 13:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:53.649 13:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:53.649 13:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.41 != \0\.\0\0 ]] 00:20:53.649 00:20:53.649 real 0m11.169s 00:20:53.649 user 0m18.721s 00:20:53.649 sys 0m1.821s 00:20:53.649 13:46:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:53.649 13:46:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:53.649 ************************************ 00:20:53.649 END TEST raid_read_error_test 00:20:53.649 ************************************ 00:20:53.649 13:46:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:53.649 13:46:42 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:20:53.649 13:46:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:53.649 13:46:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:53.649 13:46:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:53.649 ************************************ 00:20:53.649 START TEST raid_write_error_test 00:20:53.649 ************************************ 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # 
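The read-error pass that just finished boils down to: inject a read failure on the error bdev underneath BaseBdev1, drive I/O through bdevperf, then pull the raid_bdev1 failure rate out of the bdevperf log and require it to be non-zero, since concat has no redundancy to absorb the fault. A sketch of that tail end, with the command lines lifted from the trace and the surrounding glue (kill/wait instead of the killprocess helper) assumed:

  # arm a read failure on the error bdev underneath BaseBdev1 (step 827)
  $rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure

  # drive the 60 s randrw workload that bdevperf was launched with
  "$spdk/examples/bdev/bdevperf/bdevperf.py" -s /var/tmp/spdk-raid.sock perform_tests

  # tear down, stop bdevperf, then pull the raid_bdev1 failure rate (column 6)
  $rpc bdev_raid_delete raid_bdev1
  kill "$raid_pid" && wait "$raid_pid"      # the harness uses its killprocess helper here
  fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')   # 0.41 here
  [[ $fail_per_s != "0.00" ]]               # concat has no redundancy, so the error must surface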
raid_io_error_test concat 4 write 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:53.649 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.qaGXCYh2Bx 00:20:53.650 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:53.650 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=519912 00:20:53.650 13:46:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 519912 
/var/tmp/spdk-raid.sock 00:20:53.650 13:46:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 519912 ']' 00:20:53.650 13:46:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:53.650 13:46:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:53.650 13:46:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:53.650 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:53.650 13:46:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:53.650 13:46:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:53.650 [2024-07-12 13:46:42.226962] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:20:53.650 [2024-07-12 13:46:42.227015] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid519912 ] 00:20:53.909 [2024-07-12 13:46:42.342161] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:53.909 [2024-07-12 13:46:42.450624] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:54.168 [2024-07-12 13:46:42.512906] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:54.168 [2024-07-12 13:46:42.512956] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:54.738 13:46:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:54.738 13:46:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:54.738 13:46:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:54.738 13:46:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:54.997 BaseBdev1_malloc 00:20:54.997 13:46:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:55.255 true 00:20:55.255 13:46:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:55.255 [2024-07-12 13:46:43.823179] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:55.255 [2024-07-12 13:46:43.823225] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:55.255 [2024-07-12 13:46:43.823244] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1162a10 00:20:55.255 [2024-07-12 13:46:43.823257] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:55.255 [2024-07-12 13:46:43.824969] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:55.255 [2024-07-12 13:46:43.824996] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:55.255 BaseBdev1 00:20:55.514 13:46:43 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:55.514 13:46:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:55.514 BaseBdev2_malloc 00:20:55.514 13:46:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:55.773 true 00:20:55.773 13:46:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:56.031 [2024-07-12 13:46:44.513550] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:56.031 [2024-07-12 13:46:44.513591] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:56.031 [2024-07-12 13:46:44.513610] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1167250 00:20:56.031 [2024-07-12 13:46:44.513623] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:56.031 [2024-07-12 13:46:44.515037] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:56.031 [2024-07-12 13:46:44.515065] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:56.031 BaseBdev2 00:20:56.031 13:46:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:56.031 13:46:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:56.290 BaseBdev3_malloc 00:20:56.290 13:46:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:56.550 true 00:20:56.550 13:46:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:56.809 [2024-07-12 13:46:45.264125] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:56.809 [2024-07-12 13:46:45.264175] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:56.809 [2024-07-12 13:46:45.264197] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1169510 00:20:56.809 [2024-07-12 13:46:45.264210] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:56.809 [2024-07-12 13:46:45.265854] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:56.809 [2024-07-12 13:46:45.265884] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:56.809 BaseBdev3 00:20:56.809 13:46:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:56.809 13:46:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:57.067 BaseBdev4_malloc 00:20:57.067 13:46:45 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:57.326 true 00:20:57.326 13:46:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:57.585 [2024-07-12 13:46:46.014765] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:57.585 [2024-07-12 13:46:46.014817] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:57.585 [2024-07-12 13:46:46.014839] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116a3e0 00:20:57.585 [2024-07-12 13:46:46.014852] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:57.585 [2024-07-12 13:46:46.016482] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:57.585 [2024-07-12 13:46:46.016513] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:57.585 BaseBdev4 00:20:57.585 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:57.845 [2024-07-12 13:46:46.247420] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:57.845 [2024-07-12 13:46:46.248783] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:57.845 [2024-07-12 13:46:46.248853] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:57.845 [2024-07-12 13:46:46.248917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:57.845 [2024-07-12 13:46:46.249160] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1164560 00:20:57.845 [2024-07-12 13:46:46.249174] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:57.845 [2024-07-12 13:46:46.249381] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfb8ba0 00:20:57.845 [2024-07-12 13:46:46.249534] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1164560 00:20:57.845 [2024-07-12 13:46:46.249544] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1164560 00:20:57.845 [2024-07-12 13:46:46.249649] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:57.845 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:57.845 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:57.845 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:57.845 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:57.845 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:57.845 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:57.845 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.845 13:46:46 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.845 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.845 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.845 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.845 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.105 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:58.105 "name": "raid_bdev1", 00:20:58.105 "uuid": "f6017e2a-a715-4885-b6a4-b82b0af3de56", 00:20:58.105 "strip_size_kb": 64, 00:20:58.105 "state": "online", 00:20:58.105 "raid_level": "concat", 00:20:58.105 "superblock": true, 00:20:58.105 "num_base_bdevs": 4, 00:20:58.105 "num_base_bdevs_discovered": 4, 00:20:58.105 "num_base_bdevs_operational": 4, 00:20:58.105 "base_bdevs_list": [ 00:20:58.105 { 00:20:58.105 "name": "BaseBdev1", 00:20:58.105 "uuid": "c2729b19-2a77-5b98-9cea-71059d3302e9", 00:20:58.105 "is_configured": true, 00:20:58.105 "data_offset": 2048, 00:20:58.105 "data_size": 63488 00:20:58.105 }, 00:20:58.105 { 00:20:58.105 "name": "BaseBdev2", 00:20:58.105 "uuid": "d0ccfb56-aaa2-553e-8cf5-fae3e99462e1", 00:20:58.105 "is_configured": true, 00:20:58.105 "data_offset": 2048, 00:20:58.105 "data_size": 63488 00:20:58.105 }, 00:20:58.105 { 00:20:58.105 "name": "BaseBdev3", 00:20:58.105 "uuid": "59ada4dd-a8ed-5087-bcd6-8a352acd730c", 00:20:58.105 "is_configured": true, 00:20:58.105 "data_offset": 2048, 00:20:58.105 "data_size": 63488 00:20:58.105 }, 00:20:58.105 { 00:20:58.105 "name": "BaseBdev4", 00:20:58.105 "uuid": "51e8cd76-c258-5c31-b223-464bade24016", 00:20:58.105 "is_configured": true, 00:20:58.105 "data_offset": 2048, 00:20:58.105 "data_size": 63488 00:20:58.105 } 00:20:58.105 ] 00:20:58.105 }' 00:20:58.105 13:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:58.105 13:46:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:58.672 13:46:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:58.672 13:46:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:58.672 [2024-07-12 13:46:47.202231] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1156900 00:20:59.611 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.871 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:00.154 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.154 "name": "raid_bdev1", 00:21:00.154 "uuid": "f6017e2a-a715-4885-b6a4-b82b0af3de56", 00:21:00.154 "strip_size_kb": 64, 00:21:00.154 "state": "online", 00:21:00.154 "raid_level": "concat", 00:21:00.154 "superblock": true, 00:21:00.154 "num_base_bdevs": 4, 00:21:00.154 "num_base_bdevs_discovered": 4, 00:21:00.154 "num_base_bdevs_operational": 4, 00:21:00.154 "base_bdevs_list": [ 00:21:00.154 { 00:21:00.154 "name": "BaseBdev1", 00:21:00.154 "uuid": "c2729b19-2a77-5b98-9cea-71059d3302e9", 00:21:00.154 "is_configured": true, 00:21:00.154 "data_offset": 2048, 00:21:00.154 "data_size": 63488 00:21:00.154 }, 00:21:00.154 { 00:21:00.154 "name": "BaseBdev2", 00:21:00.154 "uuid": "d0ccfb56-aaa2-553e-8cf5-fae3e99462e1", 00:21:00.154 "is_configured": true, 00:21:00.154 "data_offset": 2048, 00:21:00.154 "data_size": 63488 00:21:00.154 }, 00:21:00.154 { 00:21:00.154 "name": "BaseBdev3", 00:21:00.154 "uuid": "59ada4dd-a8ed-5087-bcd6-8a352acd730c", 00:21:00.154 "is_configured": true, 00:21:00.154 "data_offset": 2048, 00:21:00.154 "data_size": 63488 00:21:00.154 }, 00:21:00.154 { 00:21:00.154 "name": "BaseBdev4", 00:21:00.154 "uuid": "51e8cd76-c258-5c31-b223-464bade24016", 00:21:00.154 "is_configured": true, 00:21:00.154 "data_offset": 2048, 00:21:00.154 "data_size": 63488 00:21:00.154 } 00:21:00.154 ] 00:21:00.154 }' 00:21:00.154 13:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.154 13:46:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:00.719 13:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:00.719 [2024-07-12 13:46:49.264907] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:00.719 [2024-07-12 13:46:49.264957] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:00.719 [2024-07-12 13:46:49.268133] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:00.719 [2024-07-12 13:46:49.268174] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:00.719 [2024-07-12 13:46:49.268214] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:21:00.719 [2024-07-12 13:46:49.268225] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1164560 name raid_bdev1, state offline 00:21:00.719 0 00:21:00.719 13:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 519912 00:21:00.719 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 519912 ']' 00:21:00.719 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 519912 00:21:00.719 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:21:00.719 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:00.719 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 519912 00:21:00.977 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:00.977 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:00.977 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 519912' 00:21:00.977 killing process with pid 519912 00:21:00.977 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 519912 00:21:00.977 [2024-07-12 13:46:49.339646] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:00.977 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 519912 00:21:00.977 [2024-07-12 13:46:49.371308] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:01.236 13:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.qaGXCYh2Bx 00:21:01.236 13:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:01.236 13:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:01.236 13:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:21:01.236 13:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:21:01.236 13:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:01.236 13:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:01.236 13:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:21:01.236 00:21:01.236 real 0m7.444s 00:21:01.236 user 0m11.813s 00:21:01.236 sys 0m1.354s 00:21:01.236 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:01.236 13:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:01.236 ************************************ 00:21:01.236 END TEST raid_write_error_test 00:21:01.236 ************************************ 00:21:01.236 13:46:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:01.236 13:46:49 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:21:01.236 13:46:49 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:21:01.236 13:46:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:01.236 13:46:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:01.236 13:46:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:01.236 
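The teardown above ends with the actual pass/fail criterion of raid_write_error_test: the failures-per-second column is pulled out of the bdevperf output file and compared against 0.00, with the direction of the comparison depending on whether the raid level has redundancy. A condensed sketch is below; the has_redundancy definition is trimmed to the two outcomes visible in this log (raid1 redundant, everything else not), and the temporary output file name is the one from this run.

# Condensed sketch of the final assertion (assumptions noted above).
bdevperf_output=/raidtest/tmp.qaGXCYh2Bx
raid_level=concat

has_redundancy() {
    case $1 in
    raid1) return 0 ;;   # redundant: injected write errors must be absorbed
    *) return 1 ;;       # raid0/concat: errors are expected to reach the caller
    esac
}

# Column 6 of the per-bdev summary line for raid_bdev1 is the fail-per-second figure.
fail_per_s=$(grep -v Job "$bdevperf_output" | grep raid_bdev1 | awk '{print $6}')

if has_redundancy "$raid_level"; then
    [[ $fail_per_s = "0.00" ]]
else
    [[ $fail_per_s != "0.00" ]]   # this run: 0.49 failures/s, so the test passes
fi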
************************************ 00:21:01.236 START TEST raid_state_function_test 00:21:01.236 ************************************ 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=521050 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 521050' 00:21:01.236 Process raid pid: 
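The block above is the parameter setup of raid_state_function_test: a counter loop echoes BaseBdev1..BaseBdev4 into the base_bdevs array, strip_size is forced to 0 because raid1 does not stripe, and the superblock argument is left empty since superblock=false. The name-list construction can be sketched on its own as below (the array-building form is simplified from the traced (( i = 1 )) / (( i <= num_base_bdevs )) counter).

# Simplified sketch of the BaseBdevN name-list construction traced above.
num_base_bdevs=4
base_bdevs=()
for ((i = 1; i <= num_base_bdevs; i++)); do
    base_bdevs+=("BaseBdev$i")
done
# "${base_bdevs[*]}" -> "BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4",
# which is exactly the -b argument later passed to bdev_raid_create.
# For raid1 the test also sets strip_size=0 and leaves the superblock argument empty.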
521050 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 521050 /var/tmp/spdk-raid.sock 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 521050 ']' 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:01.236 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:01.236 13:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:01.236 [2024-07-12 13:46:49.773738] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:21:01.236 [2024-07-12 13:46:49.773809] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:01.494 [2024-07-12 13:46:49.905463] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:01.494 [2024-07-12 13:46:50.008897] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:01.494 [2024-07-12 13:46:50.066789] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:01.494 [2024-07-12 13:46:50.066821] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:02.428 [2024-07-12 13:46:50.942380] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:02.428 [2024-07-12 13:46:50.942426] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:02.428 [2024-07-12 13:46:50.942436] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:02.428 [2024-07-12 13:46:50.942449] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:02.428 [2024-07-12 13:46:50.942458] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:02.428 [2024-07-12 13:46:50.942469] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:02.428 [2024-07-12 13:46:50.942478] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:02.428 [2024-07-12 13:46:50.942489] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
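Before any raid RPCs can be issued, bdev_svc is started with its own RPC socket and waitforlisten polls (up to the traced max_retries=100) until the socket answers. A minimal readiness loop in that spirit is sketched below; probing with rpc_get_methods is my own assumption, since the log only shows the waitforlisten wrapper and not the probe it uses internally.

# Sketch: start the bdev_svc app and wait for its RPC socket to answer.
# The rpc_get_methods probe is an assumption (see note above).
rpc_sock=/var/tmp/spdk-raid.sock
rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
    -r "$rpc_sock" -i 0 -L bdev_raid &
raid_pid=$!

for ((retry = 0; retry < 100; retry++)); do
    # Ready once one RPC round-trip over the UNIX socket succeeds.
    if "$rpc_py" -s "$rpc_sock" rpc_get_methods >/dev/null 2>&1; then
        break
    fi
    kill -0 "$raid_pid" || exit 1   # bail out if the app died during startup
    sleep 0.1
done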
BaseBdev4 doesn't exist now 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.428 13:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:02.687 13:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.687 "name": "Existed_Raid", 00:21:02.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.687 "strip_size_kb": 0, 00:21:02.687 "state": "configuring", 00:21:02.687 "raid_level": "raid1", 00:21:02.687 "superblock": false, 00:21:02.687 "num_base_bdevs": 4, 00:21:02.687 "num_base_bdevs_discovered": 0, 00:21:02.687 "num_base_bdevs_operational": 4, 00:21:02.687 "base_bdevs_list": [ 00:21:02.687 { 00:21:02.687 "name": "BaseBdev1", 00:21:02.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.687 "is_configured": false, 00:21:02.687 "data_offset": 0, 00:21:02.687 "data_size": 0 00:21:02.687 }, 00:21:02.687 { 00:21:02.687 "name": "BaseBdev2", 00:21:02.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.687 "is_configured": false, 00:21:02.687 "data_offset": 0, 00:21:02.687 "data_size": 0 00:21:02.687 }, 00:21:02.687 { 00:21:02.687 "name": "BaseBdev3", 00:21:02.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.687 "is_configured": false, 00:21:02.687 "data_offset": 0, 00:21:02.687 "data_size": 0 00:21:02.687 }, 00:21:02.687 { 00:21:02.687 "name": "BaseBdev4", 00:21:02.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.687 "is_configured": false, 00:21:02.687 "data_offset": 0, 00:21:02.687 "data_size": 0 00:21:02.687 } 00:21:02.687 ] 00:21:02.687 }' 00:21:02.687 13:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.687 13:46:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:03.252 13:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:03.512 [2024-07-12 13:46:52.057196] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:03.512 [2024-07-12 13:46:52.057230] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
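The dump above shows the freshly created Existed_Raid parked in the "configuring" state with num_base_bdevs_discovered 0: the raid was created against four base bdev names that do not exist yet, so nothing can be claimed. The same count can also be derived directly from base_bdevs_list with jq, as sketched below; the jq program is my own, whereas the script simply reads the precomputed num_base_bdevs_discovered field.

# Alternative way to count configured base bdevs (sketch; see note above).
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

discovered=$($rpc_py bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid").base_bdevs_list | map(select(.is_configured)) | length')
echo "$discovered"   # 0 at this point; it grows as each BaseBdevN malloc bdev appears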
raid_bdev_cleanup, 0x1025370 name Existed_Raid, state configuring 00:21:03.512 13:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:03.771 [2024-07-12 13:46:52.301859] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:03.771 [2024-07-12 13:46:52.301893] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:03.771 [2024-07-12 13:46:52.301903] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:03.771 [2024-07-12 13:46:52.301914] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:03.771 [2024-07-12 13:46:52.301923] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:03.771 [2024-07-12 13:46:52.301941] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:03.771 [2024-07-12 13:46:52.301950] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:03.771 [2024-07-12 13:46:52.301961] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:03.771 13:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:04.031 [2024-07-12 13:46:52.545649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:04.031 BaseBdev1 00:21:04.031 13:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:04.031 13:46:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:04.031 13:46:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:04.031 13:46:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:04.031 13:46:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:04.031 13:46:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:04.031 13:46:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:04.290 13:46:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:04.549 [ 00:21:04.549 { 00:21:04.549 "name": "BaseBdev1", 00:21:04.549 "aliases": [ 00:21:04.549 "0f3fd757-670c-43c5-9bad-d3490bb7f62c" 00:21:04.549 ], 00:21:04.549 "product_name": "Malloc disk", 00:21:04.549 "block_size": 512, 00:21:04.549 "num_blocks": 65536, 00:21:04.549 "uuid": "0f3fd757-670c-43c5-9bad-d3490bb7f62c", 00:21:04.549 "assigned_rate_limits": { 00:21:04.549 "rw_ios_per_sec": 0, 00:21:04.549 "rw_mbytes_per_sec": 0, 00:21:04.549 "r_mbytes_per_sec": 0, 00:21:04.549 "w_mbytes_per_sec": 0 00:21:04.549 }, 00:21:04.549 "claimed": true, 00:21:04.549 "claim_type": "exclusive_write", 00:21:04.549 "zoned": false, 00:21:04.549 "supported_io_types": { 00:21:04.549 "read": true, 00:21:04.549 "write": 
true, 00:21:04.549 "unmap": true, 00:21:04.549 "flush": true, 00:21:04.549 "reset": true, 00:21:04.549 "nvme_admin": false, 00:21:04.549 "nvme_io": false, 00:21:04.549 "nvme_io_md": false, 00:21:04.549 "write_zeroes": true, 00:21:04.549 "zcopy": true, 00:21:04.549 "get_zone_info": false, 00:21:04.549 "zone_management": false, 00:21:04.549 "zone_append": false, 00:21:04.549 "compare": false, 00:21:04.549 "compare_and_write": false, 00:21:04.549 "abort": true, 00:21:04.549 "seek_hole": false, 00:21:04.549 "seek_data": false, 00:21:04.549 "copy": true, 00:21:04.549 "nvme_iov_md": false 00:21:04.549 }, 00:21:04.549 "memory_domains": [ 00:21:04.549 { 00:21:04.549 "dma_device_id": "system", 00:21:04.549 "dma_device_type": 1 00:21:04.549 }, 00:21:04.549 { 00:21:04.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:04.550 "dma_device_type": 2 00:21:04.550 } 00:21:04.550 ], 00:21:04.550 "driver_specific": {} 00:21:04.550 } 00:21:04.550 ] 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.550 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:04.809 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:04.809 "name": "Existed_Raid", 00:21:04.809 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.809 "strip_size_kb": 0, 00:21:04.809 "state": "configuring", 00:21:04.809 "raid_level": "raid1", 00:21:04.809 "superblock": false, 00:21:04.809 "num_base_bdevs": 4, 00:21:04.809 "num_base_bdevs_discovered": 1, 00:21:04.809 "num_base_bdevs_operational": 4, 00:21:04.809 "base_bdevs_list": [ 00:21:04.809 { 00:21:04.809 "name": "BaseBdev1", 00:21:04.809 "uuid": "0f3fd757-670c-43c5-9bad-d3490bb7f62c", 00:21:04.809 "is_configured": true, 00:21:04.809 "data_offset": 0, 00:21:04.809 "data_size": 65536 00:21:04.809 }, 00:21:04.809 { 00:21:04.809 "name": "BaseBdev2", 00:21:04.809 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.809 "is_configured": false, 00:21:04.809 "data_offset": 0, 00:21:04.809 "data_size": 0 00:21:04.809 }, 00:21:04.809 { 00:21:04.809 "name": "BaseBdev3", 00:21:04.809 
"uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.809 "is_configured": false, 00:21:04.809 "data_offset": 0, 00:21:04.809 "data_size": 0 00:21:04.809 }, 00:21:04.809 { 00:21:04.809 "name": "BaseBdev4", 00:21:04.809 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.809 "is_configured": false, 00:21:04.809 "data_offset": 0, 00:21:04.809 "data_size": 0 00:21:04.809 } 00:21:04.809 ] 00:21:04.809 }' 00:21:04.809 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:04.809 13:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:05.377 13:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:05.636 [2024-07-12 13:46:54.105807] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:05.636 [2024-07-12 13:46:54.105853] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1024be0 name Existed_Raid, state configuring 00:21:05.636 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:05.896 [2024-07-12 13:46:54.338453] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:05.896 [2024-07-12 13:46:54.339939] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:05.896 [2024-07-12 13:46:54.339973] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:05.896 [2024-07-12 13:46:54.339984] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:05.896 [2024-07-12 13:46:54.339996] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:05.896 [2024-07-12 13:46:54.340005] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:05.896 [2024-07-12 13:46:54.340016] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:05.896 
13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.896 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:06.155 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:06.155 "name": "Existed_Raid", 00:21:06.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:06.155 "strip_size_kb": 0, 00:21:06.155 "state": "configuring", 00:21:06.155 "raid_level": "raid1", 00:21:06.155 "superblock": false, 00:21:06.155 "num_base_bdevs": 4, 00:21:06.155 "num_base_bdevs_discovered": 1, 00:21:06.155 "num_base_bdevs_operational": 4, 00:21:06.155 "base_bdevs_list": [ 00:21:06.155 { 00:21:06.155 "name": "BaseBdev1", 00:21:06.155 "uuid": "0f3fd757-670c-43c5-9bad-d3490bb7f62c", 00:21:06.155 "is_configured": true, 00:21:06.155 "data_offset": 0, 00:21:06.155 "data_size": 65536 00:21:06.155 }, 00:21:06.155 { 00:21:06.155 "name": "BaseBdev2", 00:21:06.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:06.155 "is_configured": false, 00:21:06.155 "data_offset": 0, 00:21:06.155 "data_size": 0 00:21:06.155 }, 00:21:06.155 { 00:21:06.155 "name": "BaseBdev3", 00:21:06.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:06.155 "is_configured": false, 00:21:06.155 "data_offset": 0, 00:21:06.155 "data_size": 0 00:21:06.155 }, 00:21:06.155 { 00:21:06.155 "name": "BaseBdev4", 00:21:06.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:06.155 "is_configured": false, 00:21:06.155 "data_offset": 0, 00:21:06.155 "data_size": 0 00:21:06.155 } 00:21:06.155 ] 00:21:06.155 }' 00:21:06.155 13:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:06.155 13:46:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:06.723 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:06.982 [2024-07-12 13:46:55.368576] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:06.982 BaseBdev2 00:21:06.982 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:06.982 13:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:06.982 13:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:06.982 13:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:06.982 13:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:06.982 13:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:06.982 13:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:06.982 13:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:07.242 [ 00:21:07.242 { 00:21:07.242 "name": 
"BaseBdev2", 00:21:07.242 "aliases": [ 00:21:07.242 "a774d576-55da-40c6-a191-778df05b5f30" 00:21:07.242 ], 00:21:07.242 "product_name": "Malloc disk", 00:21:07.242 "block_size": 512, 00:21:07.242 "num_blocks": 65536, 00:21:07.242 "uuid": "a774d576-55da-40c6-a191-778df05b5f30", 00:21:07.242 "assigned_rate_limits": { 00:21:07.242 "rw_ios_per_sec": 0, 00:21:07.242 "rw_mbytes_per_sec": 0, 00:21:07.242 "r_mbytes_per_sec": 0, 00:21:07.242 "w_mbytes_per_sec": 0 00:21:07.242 }, 00:21:07.242 "claimed": true, 00:21:07.242 "claim_type": "exclusive_write", 00:21:07.242 "zoned": false, 00:21:07.242 "supported_io_types": { 00:21:07.242 "read": true, 00:21:07.242 "write": true, 00:21:07.242 "unmap": true, 00:21:07.242 "flush": true, 00:21:07.242 "reset": true, 00:21:07.242 "nvme_admin": false, 00:21:07.242 "nvme_io": false, 00:21:07.242 "nvme_io_md": false, 00:21:07.242 "write_zeroes": true, 00:21:07.242 "zcopy": true, 00:21:07.242 "get_zone_info": false, 00:21:07.242 "zone_management": false, 00:21:07.242 "zone_append": false, 00:21:07.242 "compare": false, 00:21:07.242 "compare_and_write": false, 00:21:07.242 "abort": true, 00:21:07.242 "seek_hole": false, 00:21:07.242 "seek_data": false, 00:21:07.242 "copy": true, 00:21:07.242 "nvme_iov_md": false 00:21:07.242 }, 00:21:07.242 "memory_domains": [ 00:21:07.242 { 00:21:07.242 "dma_device_id": "system", 00:21:07.242 "dma_device_type": 1 00:21:07.242 }, 00:21:07.242 { 00:21:07.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.242 "dma_device_type": 2 00:21:07.242 } 00:21:07.242 ], 00:21:07.242 "driver_specific": {} 00:21:07.242 } 00:21:07.242 ] 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.242 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:07.501 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:07.501 "name": "Existed_Raid", 
00:21:07.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:07.501 "strip_size_kb": 0, 00:21:07.501 "state": "configuring", 00:21:07.501 "raid_level": "raid1", 00:21:07.501 "superblock": false, 00:21:07.501 "num_base_bdevs": 4, 00:21:07.501 "num_base_bdevs_discovered": 2, 00:21:07.501 "num_base_bdevs_operational": 4, 00:21:07.501 "base_bdevs_list": [ 00:21:07.501 { 00:21:07.501 "name": "BaseBdev1", 00:21:07.501 "uuid": "0f3fd757-670c-43c5-9bad-d3490bb7f62c", 00:21:07.501 "is_configured": true, 00:21:07.501 "data_offset": 0, 00:21:07.501 "data_size": 65536 00:21:07.501 }, 00:21:07.501 { 00:21:07.501 "name": "BaseBdev2", 00:21:07.501 "uuid": "a774d576-55da-40c6-a191-778df05b5f30", 00:21:07.501 "is_configured": true, 00:21:07.501 "data_offset": 0, 00:21:07.501 "data_size": 65536 00:21:07.501 }, 00:21:07.501 { 00:21:07.501 "name": "BaseBdev3", 00:21:07.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:07.501 "is_configured": false, 00:21:07.501 "data_offset": 0, 00:21:07.501 "data_size": 0 00:21:07.501 }, 00:21:07.501 { 00:21:07.501 "name": "BaseBdev4", 00:21:07.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:07.501 "is_configured": false, 00:21:07.501 "data_offset": 0, 00:21:07.501 "data_size": 0 00:21:07.501 } 00:21:07.501 ] 00:21:07.501 }' 00:21:07.501 13:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:07.501 13:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:08.070 13:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:08.329 [2024-07-12 13:46:56.752877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:08.329 BaseBdev3 00:21:08.329 13:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:08.329 13:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:08.329 13:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:08.329 13:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:08.329 13:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:08.329 13:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:08.329 13:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:08.588 13:46:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:08.848 [ 00:21:08.848 { 00:21:08.848 "name": "BaseBdev3", 00:21:08.848 "aliases": [ 00:21:08.848 "b3d5fda7-afd8-4bde-8785-99f652e7679f" 00:21:08.848 ], 00:21:08.848 "product_name": "Malloc disk", 00:21:08.848 "block_size": 512, 00:21:08.848 "num_blocks": 65536, 00:21:08.848 "uuid": "b3d5fda7-afd8-4bde-8785-99f652e7679f", 00:21:08.848 "assigned_rate_limits": { 00:21:08.848 "rw_ios_per_sec": 0, 00:21:08.848 "rw_mbytes_per_sec": 0, 00:21:08.848 "r_mbytes_per_sec": 0, 00:21:08.848 "w_mbytes_per_sec": 0 00:21:08.848 }, 00:21:08.848 "claimed": true, 00:21:08.848 "claim_type": "exclusive_write", 
00:21:08.848 "zoned": false, 00:21:08.848 "supported_io_types": { 00:21:08.848 "read": true, 00:21:08.848 "write": true, 00:21:08.848 "unmap": true, 00:21:08.848 "flush": true, 00:21:08.848 "reset": true, 00:21:08.848 "nvme_admin": false, 00:21:08.848 "nvme_io": false, 00:21:08.848 "nvme_io_md": false, 00:21:08.848 "write_zeroes": true, 00:21:08.848 "zcopy": true, 00:21:08.848 "get_zone_info": false, 00:21:08.848 "zone_management": false, 00:21:08.848 "zone_append": false, 00:21:08.848 "compare": false, 00:21:08.848 "compare_and_write": false, 00:21:08.848 "abort": true, 00:21:08.848 "seek_hole": false, 00:21:08.848 "seek_data": false, 00:21:08.848 "copy": true, 00:21:08.848 "nvme_iov_md": false 00:21:08.848 }, 00:21:08.848 "memory_domains": [ 00:21:08.848 { 00:21:08.848 "dma_device_id": "system", 00:21:08.848 "dma_device_type": 1 00:21:08.848 }, 00:21:08.848 { 00:21:08.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.848 "dma_device_type": 2 00:21:08.848 } 00:21:08.848 ], 00:21:08.848 "driver_specific": {} 00:21:08.848 } 00:21:08.848 ] 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.848 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:09.107 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:09.107 "name": "Existed_Raid", 00:21:09.107 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:09.107 "strip_size_kb": 0, 00:21:09.107 "state": "configuring", 00:21:09.107 "raid_level": "raid1", 00:21:09.107 "superblock": false, 00:21:09.107 "num_base_bdevs": 4, 00:21:09.107 "num_base_bdevs_discovered": 3, 00:21:09.107 "num_base_bdevs_operational": 4, 00:21:09.107 "base_bdevs_list": [ 00:21:09.107 { 00:21:09.107 "name": "BaseBdev1", 00:21:09.107 "uuid": "0f3fd757-670c-43c5-9bad-d3490bb7f62c", 00:21:09.107 "is_configured": true, 00:21:09.107 "data_offset": 0, 00:21:09.107 
"data_size": 65536 00:21:09.107 }, 00:21:09.107 { 00:21:09.107 "name": "BaseBdev2", 00:21:09.107 "uuid": "a774d576-55da-40c6-a191-778df05b5f30", 00:21:09.107 "is_configured": true, 00:21:09.107 "data_offset": 0, 00:21:09.107 "data_size": 65536 00:21:09.107 }, 00:21:09.107 { 00:21:09.107 "name": "BaseBdev3", 00:21:09.107 "uuid": "b3d5fda7-afd8-4bde-8785-99f652e7679f", 00:21:09.107 "is_configured": true, 00:21:09.107 "data_offset": 0, 00:21:09.107 "data_size": 65536 00:21:09.107 }, 00:21:09.107 { 00:21:09.107 "name": "BaseBdev4", 00:21:09.107 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:09.107 "is_configured": false, 00:21:09.107 "data_offset": 0, 00:21:09.107 "data_size": 0 00:21:09.107 } 00:21:09.107 ] 00:21:09.107 }' 00:21:09.107 13:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:09.107 13:46:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:09.675 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:09.935 [2024-07-12 13:46:58.364538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:09.935 [2024-07-12 13:46:58.364581] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1025c40 00:21:09.935 [2024-07-12 13:46:58.364589] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:09.935 [2024-07-12 13:46:58.364801] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10268c0 00:21:09.935 [2024-07-12 13:46:58.364924] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1025c40 00:21:09.935 [2024-07-12 13:46:58.364942] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1025c40 00:21:09.935 [2024-07-12 13:46:58.365106] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:09.935 BaseBdev4 00:21:09.935 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:09.935 13:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:09.935 13:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:09.935 13:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:09.935 13:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:09.935 13:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:09.935 13:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:10.195 13:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:10.454 [ 00:21:10.454 { 00:21:10.454 "name": "BaseBdev4", 00:21:10.454 "aliases": [ 00:21:10.454 "61801b3c-d446-4205-bdc2-dbbf3a739da7" 00:21:10.454 ], 00:21:10.454 "product_name": "Malloc disk", 00:21:10.454 "block_size": 512, 00:21:10.454 "num_blocks": 65536, 00:21:10.454 "uuid": "61801b3c-d446-4205-bdc2-dbbf3a739da7", 00:21:10.454 "assigned_rate_limits": { 00:21:10.454 
"rw_ios_per_sec": 0, 00:21:10.454 "rw_mbytes_per_sec": 0, 00:21:10.454 "r_mbytes_per_sec": 0, 00:21:10.454 "w_mbytes_per_sec": 0 00:21:10.454 }, 00:21:10.454 "claimed": true, 00:21:10.454 "claim_type": "exclusive_write", 00:21:10.454 "zoned": false, 00:21:10.454 "supported_io_types": { 00:21:10.454 "read": true, 00:21:10.454 "write": true, 00:21:10.454 "unmap": true, 00:21:10.454 "flush": true, 00:21:10.454 "reset": true, 00:21:10.454 "nvme_admin": false, 00:21:10.454 "nvme_io": false, 00:21:10.454 "nvme_io_md": false, 00:21:10.454 "write_zeroes": true, 00:21:10.454 "zcopy": true, 00:21:10.454 "get_zone_info": false, 00:21:10.454 "zone_management": false, 00:21:10.454 "zone_append": false, 00:21:10.454 "compare": false, 00:21:10.454 "compare_and_write": false, 00:21:10.454 "abort": true, 00:21:10.454 "seek_hole": false, 00:21:10.454 "seek_data": false, 00:21:10.454 "copy": true, 00:21:10.454 "nvme_iov_md": false 00:21:10.454 }, 00:21:10.454 "memory_domains": [ 00:21:10.455 { 00:21:10.455 "dma_device_id": "system", 00:21:10.455 "dma_device_type": 1 00:21:10.455 }, 00:21:10.455 { 00:21:10.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:10.455 "dma_device_type": 2 00:21:10.455 } 00:21:10.455 ], 00:21:10.455 "driver_specific": {} 00:21:10.455 } 00:21:10.455 ] 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.455 13:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:10.714 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.714 "name": "Existed_Raid", 00:21:10.714 "uuid": "490158bf-6cbf-4d81-8db8-5f77ce3c7294", 00:21:10.714 "strip_size_kb": 0, 00:21:10.714 "state": "online", 00:21:10.714 "raid_level": "raid1", 00:21:10.714 "superblock": false, 00:21:10.714 "num_base_bdevs": 4, 00:21:10.714 "num_base_bdevs_discovered": 4, 00:21:10.714 "num_base_bdevs_operational": 4, 00:21:10.714 
"base_bdevs_list": [ 00:21:10.714 { 00:21:10.714 "name": "BaseBdev1", 00:21:10.714 "uuid": "0f3fd757-670c-43c5-9bad-d3490bb7f62c", 00:21:10.714 "is_configured": true, 00:21:10.714 "data_offset": 0, 00:21:10.714 "data_size": 65536 00:21:10.714 }, 00:21:10.714 { 00:21:10.714 "name": "BaseBdev2", 00:21:10.714 "uuid": "a774d576-55da-40c6-a191-778df05b5f30", 00:21:10.714 "is_configured": true, 00:21:10.714 "data_offset": 0, 00:21:10.714 "data_size": 65536 00:21:10.714 }, 00:21:10.714 { 00:21:10.714 "name": "BaseBdev3", 00:21:10.714 "uuid": "b3d5fda7-afd8-4bde-8785-99f652e7679f", 00:21:10.714 "is_configured": true, 00:21:10.714 "data_offset": 0, 00:21:10.714 "data_size": 65536 00:21:10.714 }, 00:21:10.714 { 00:21:10.714 "name": "BaseBdev4", 00:21:10.714 "uuid": "61801b3c-d446-4205-bdc2-dbbf3a739da7", 00:21:10.714 "is_configured": true, 00:21:10.714 "data_offset": 0, 00:21:10.714 "data_size": 65536 00:21:10.714 } 00:21:10.714 ] 00:21:10.714 }' 00:21:10.714 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.714 13:46:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:11.282 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:11.282 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:11.282 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:11.282 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:11.282 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:11.282 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:11.282 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:11.282 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:11.542 [2024-07-12 13:46:59.941074] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:11.542 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:11.542 "name": "Existed_Raid", 00:21:11.542 "aliases": [ 00:21:11.542 "490158bf-6cbf-4d81-8db8-5f77ce3c7294" 00:21:11.542 ], 00:21:11.542 "product_name": "Raid Volume", 00:21:11.542 "block_size": 512, 00:21:11.542 "num_blocks": 65536, 00:21:11.542 "uuid": "490158bf-6cbf-4d81-8db8-5f77ce3c7294", 00:21:11.542 "assigned_rate_limits": { 00:21:11.542 "rw_ios_per_sec": 0, 00:21:11.542 "rw_mbytes_per_sec": 0, 00:21:11.542 "r_mbytes_per_sec": 0, 00:21:11.542 "w_mbytes_per_sec": 0 00:21:11.542 }, 00:21:11.542 "claimed": false, 00:21:11.542 "zoned": false, 00:21:11.542 "supported_io_types": { 00:21:11.542 "read": true, 00:21:11.542 "write": true, 00:21:11.542 "unmap": false, 00:21:11.542 "flush": false, 00:21:11.542 "reset": true, 00:21:11.542 "nvme_admin": false, 00:21:11.542 "nvme_io": false, 00:21:11.542 "nvme_io_md": false, 00:21:11.542 "write_zeroes": true, 00:21:11.542 "zcopy": false, 00:21:11.542 "get_zone_info": false, 00:21:11.542 "zone_management": false, 00:21:11.542 "zone_append": false, 00:21:11.542 "compare": false, 00:21:11.542 "compare_and_write": false, 00:21:11.542 "abort": false, 00:21:11.542 "seek_hole": false, 00:21:11.542 "seek_data": false, 00:21:11.542 "copy": 
false, 00:21:11.542 "nvme_iov_md": false 00:21:11.542 }, 00:21:11.542 "memory_domains": [ 00:21:11.542 { 00:21:11.542 "dma_device_id": "system", 00:21:11.542 "dma_device_type": 1 00:21:11.542 }, 00:21:11.542 { 00:21:11.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.542 "dma_device_type": 2 00:21:11.542 }, 00:21:11.542 { 00:21:11.542 "dma_device_id": "system", 00:21:11.542 "dma_device_type": 1 00:21:11.542 }, 00:21:11.542 { 00:21:11.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.542 "dma_device_type": 2 00:21:11.542 }, 00:21:11.542 { 00:21:11.542 "dma_device_id": "system", 00:21:11.542 "dma_device_type": 1 00:21:11.542 }, 00:21:11.542 { 00:21:11.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.542 "dma_device_type": 2 00:21:11.542 }, 00:21:11.542 { 00:21:11.542 "dma_device_id": "system", 00:21:11.542 "dma_device_type": 1 00:21:11.542 }, 00:21:11.542 { 00:21:11.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.542 "dma_device_type": 2 00:21:11.542 } 00:21:11.542 ], 00:21:11.542 "driver_specific": { 00:21:11.542 "raid": { 00:21:11.542 "uuid": "490158bf-6cbf-4d81-8db8-5f77ce3c7294", 00:21:11.542 "strip_size_kb": 0, 00:21:11.542 "state": "online", 00:21:11.542 "raid_level": "raid1", 00:21:11.542 "superblock": false, 00:21:11.542 "num_base_bdevs": 4, 00:21:11.542 "num_base_bdevs_discovered": 4, 00:21:11.542 "num_base_bdevs_operational": 4, 00:21:11.542 "base_bdevs_list": [ 00:21:11.542 { 00:21:11.542 "name": "BaseBdev1", 00:21:11.542 "uuid": "0f3fd757-670c-43c5-9bad-d3490bb7f62c", 00:21:11.542 "is_configured": true, 00:21:11.542 "data_offset": 0, 00:21:11.542 "data_size": 65536 00:21:11.542 }, 00:21:11.542 { 00:21:11.542 "name": "BaseBdev2", 00:21:11.542 "uuid": "a774d576-55da-40c6-a191-778df05b5f30", 00:21:11.542 "is_configured": true, 00:21:11.542 "data_offset": 0, 00:21:11.542 "data_size": 65536 00:21:11.542 }, 00:21:11.542 { 00:21:11.542 "name": "BaseBdev3", 00:21:11.542 "uuid": "b3d5fda7-afd8-4bde-8785-99f652e7679f", 00:21:11.542 "is_configured": true, 00:21:11.542 "data_offset": 0, 00:21:11.542 "data_size": 65536 00:21:11.542 }, 00:21:11.542 { 00:21:11.542 "name": "BaseBdev4", 00:21:11.542 "uuid": "61801b3c-d446-4205-bdc2-dbbf3a739da7", 00:21:11.542 "is_configured": true, 00:21:11.542 "data_offset": 0, 00:21:11.542 "data_size": 65536 00:21:11.542 } 00:21:11.542 ] 00:21:11.542 } 00:21:11.542 } 00:21:11.542 }' 00:21:11.542 13:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:11.542 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:11.542 BaseBdev2 00:21:11.542 BaseBdev3 00:21:11.542 BaseBdev4' 00:21:11.542 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:11.542 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:11.542 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:11.801 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:11.801 "name": "BaseBdev1", 00:21:11.801 "aliases": [ 00:21:11.801 "0f3fd757-670c-43c5-9bad-d3490bb7f62c" 00:21:11.802 ], 00:21:11.802 "product_name": "Malloc disk", 00:21:11.802 "block_size": 512, 00:21:11.802 "num_blocks": 65536, 00:21:11.802 "uuid": "0f3fd757-670c-43c5-9bad-d3490bb7f62c", 
00:21:11.802 "assigned_rate_limits": { 00:21:11.802 "rw_ios_per_sec": 0, 00:21:11.802 "rw_mbytes_per_sec": 0, 00:21:11.802 "r_mbytes_per_sec": 0, 00:21:11.802 "w_mbytes_per_sec": 0 00:21:11.802 }, 00:21:11.802 "claimed": true, 00:21:11.802 "claim_type": "exclusive_write", 00:21:11.802 "zoned": false, 00:21:11.802 "supported_io_types": { 00:21:11.802 "read": true, 00:21:11.802 "write": true, 00:21:11.802 "unmap": true, 00:21:11.802 "flush": true, 00:21:11.802 "reset": true, 00:21:11.802 "nvme_admin": false, 00:21:11.802 "nvme_io": false, 00:21:11.802 "nvme_io_md": false, 00:21:11.802 "write_zeroes": true, 00:21:11.802 "zcopy": true, 00:21:11.802 "get_zone_info": false, 00:21:11.802 "zone_management": false, 00:21:11.802 "zone_append": false, 00:21:11.802 "compare": false, 00:21:11.802 "compare_and_write": false, 00:21:11.802 "abort": true, 00:21:11.802 "seek_hole": false, 00:21:11.802 "seek_data": false, 00:21:11.802 "copy": true, 00:21:11.802 "nvme_iov_md": false 00:21:11.802 }, 00:21:11.802 "memory_domains": [ 00:21:11.802 { 00:21:11.802 "dma_device_id": "system", 00:21:11.802 "dma_device_type": 1 00:21:11.802 }, 00:21:11.802 { 00:21:11.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.802 "dma_device_type": 2 00:21:11.802 } 00:21:11.802 ], 00:21:11.802 "driver_specific": {} 00:21:11.802 }' 00:21:11.802 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:11.802 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:11.802 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:11.802 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.061 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.061 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:12.061 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.061 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.061 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:12.061 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.061 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.061 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:12.061 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:12.061 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:12.061 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:12.320 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:12.320 "name": "BaseBdev2", 00:21:12.320 "aliases": [ 00:21:12.320 "a774d576-55da-40c6-a191-778df05b5f30" 00:21:12.320 ], 00:21:12.320 "product_name": "Malloc disk", 00:21:12.320 "block_size": 512, 00:21:12.320 "num_blocks": 65536, 00:21:12.320 "uuid": "a774d576-55da-40c6-a191-778df05b5f30", 00:21:12.320 "assigned_rate_limits": { 00:21:12.320 "rw_ios_per_sec": 0, 00:21:12.320 "rw_mbytes_per_sec": 0, 00:21:12.320 "r_mbytes_per_sec": 0, 00:21:12.320 "w_mbytes_per_sec": 0 
00:21:12.320 }, 00:21:12.320 "claimed": true, 00:21:12.320 "claim_type": "exclusive_write", 00:21:12.320 "zoned": false, 00:21:12.320 "supported_io_types": { 00:21:12.320 "read": true, 00:21:12.320 "write": true, 00:21:12.320 "unmap": true, 00:21:12.320 "flush": true, 00:21:12.320 "reset": true, 00:21:12.320 "nvme_admin": false, 00:21:12.320 "nvme_io": false, 00:21:12.320 "nvme_io_md": false, 00:21:12.320 "write_zeroes": true, 00:21:12.320 "zcopy": true, 00:21:12.320 "get_zone_info": false, 00:21:12.320 "zone_management": false, 00:21:12.320 "zone_append": false, 00:21:12.320 "compare": false, 00:21:12.320 "compare_and_write": false, 00:21:12.320 "abort": true, 00:21:12.320 "seek_hole": false, 00:21:12.320 "seek_data": false, 00:21:12.320 "copy": true, 00:21:12.320 "nvme_iov_md": false 00:21:12.320 }, 00:21:12.320 "memory_domains": [ 00:21:12.320 { 00:21:12.320 "dma_device_id": "system", 00:21:12.320 "dma_device_type": 1 00:21:12.320 }, 00:21:12.320 { 00:21:12.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.320 "dma_device_type": 2 00:21:12.320 } 00:21:12.320 ], 00:21:12.320 "driver_specific": {} 00:21:12.320 }' 00:21:12.320 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:12.579 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:12.579 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:12.579 13:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.579 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.579 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:12.579 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.579 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.838 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:12.838 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.838 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.838 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:12.838 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:12.838 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:12.838 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:13.098 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:13.098 "name": "BaseBdev3", 00:21:13.098 "aliases": [ 00:21:13.098 "b3d5fda7-afd8-4bde-8785-99f652e7679f" 00:21:13.098 ], 00:21:13.098 "product_name": "Malloc disk", 00:21:13.098 "block_size": 512, 00:21:13.098 "num_blocks": 65536, 00:21:13.098 "uuid": "b3d5fda7-afd8-4bde-8785-99f652e7679f", 00:21:13.098 "assigned_rate_limits": { 00:21:13.098 "rw_ios_per_sec": 0, 00:21:13.098 "rw_mbytes_per_sec": 0, 00:21:13.098 "r_mbytes_per_sec": 0, 00:21:13.098 "w_mbytes_per_sec": 0 00:21:13.098 }, 00:21:13.098 "claimed": true, 00:21:13.098 "claim_type": "exclusive_write", 00:21:13.098 "zoned": false, 00:21:13.098 "supported_io_types": { 00:21:13.098 "read": 
true, 00:21:13.098 "write": true, 00:21:13.098 "unmap": true, 00:21:13.098 "flush": true, 00:21:13.098 "reset": true, 00:21:13.098 "nvme_admin": false, 00:21:13.098 "nvme_io": false, 00:21:13.098 "nvme_io_md": false, 00:21:13.098 "write_zeroes": true, 00:21:13.098 "zcopy": true, 00:21:13.098 "get_zone_info": false, 00:21:13.098 "zone_management": false, 00:21:13.098 "zone_append": false, 00:21:13.098 "compare": false, 00:21:13.098 "compare_and_write": false, 00:21:13.098 "abort": true, 00:21:13.098 "seek_hole": false, 00:21:13.098 "seek_data": false, 00:21:13.098 "copy": true, 00:21:13.098 "nvme_iov_md": false 00:21:13.098 }, 00:21:13.098 "memory_domains": [ 00:21:13.098 { 00:21:13.098 "dma_device_id": "system", 00:21:13.098 "dma_device_type": 1 00:21:13.098 }, 00:21:13.098 { 00:21:13.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.098 "dma_device_type": 2 00:21:13.098 } 00:21:13.098 ], 00:21:13.098 "driver_specific": {} 00:21:13.098 }' 00:21:13.098 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.098 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.098 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:13.098 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.357 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.357 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:13.357 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.357 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.357 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:13.357 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.357 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.358 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:13.358 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:13.358 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:13.358 13:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:13.617 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:13.617 "name": "BaseBdev4", 00:21:13.617 "aliases": [ 00:21:13.617 "61801b3c-d446-4205-bdc2-dbbf3a739da7" 00:21:13.617 ], 00:21:13.617 "product_name": "Malloc disk", 00:21:13.617 "block_size": 512, 00:21:13.617 "num_blocks": 65536, 00:21:13.617 "uuid": "61801b3c-d446-4205-bdc2-dbbf3a739da7", 00:21:13.617 "assigned_rate_limits": { 00:21:13.617 "rw_ios_per_sec": 0, 00:21:13.617 "rw_mbytes_per_sec": 0, 00:21:13.617 "r_mbytes_per_sec": 0, 00:21:13.617 "w_mbytes_per_sec": 0 00:21:13.617 }, 00:21:13.617 "claimed": true, 00:21:13.617 "claim_type": "exclusive_write", 00:21:13.617 "zoned": false, 00:21:13.617 "supported_io_types": { 00:21:13.617 "read": true, 00:21:13.617 "write": true, 00:21:13.617 "unmap": true, 00:21:13.617 "flush": true, 00:21:13.617 "reset": true, 00:21:13.617 "nvme_admin": false, 00:21:13.617 "nvme_io": 
false, 00:21:13.617 "nvme_io_md": false, 00:21:13.617 "write_zeroes": true, 00:21:13.617 "zcopy": true, 00:21:13.617 "get_zone_info": false, 00:21:13.617 "zone_management": false, 00:21:13.617 "zone_append": false, 00:21:13.617 "compare": false, 00:21:13.617 "compare_and_write": false, 00:21:13.617 "abort": true, 00:21:13.617 "seek_hole": false, 00:21:13.617 "seek_data": false, 00:21:13.617 "copy": true, 00:21:13.617 "nvme_iov_md": false 00:21:13.617 }, 00:21:13.617 "memory_domains": [ 00:21:13.617 { 00:21:13.617 "dma_device_id": "system", 00:21:13.617 "dma_device_type": 1 00:21:13.617 }, 00:21:13.617 { 00:21:13.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.617 "dma_device_type": 2 00:21:13.617 } 00:21:13.617 ], 00:21:13.617 "driver_specific": {} 00:21:13.617 }' 00:21:13.617 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.617 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.875 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:13.875 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.875 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.875 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:13.875 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.875 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.875 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:13.875 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.875 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.875 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:13.875 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:14.134 [2024-07-12 13:47:02.660227] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.134 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:14.393 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.393 "name": "Existed_Raid", 00:21:14.393 "uuid": "490158bf-6cbf-4d81-8db8-5f77ce3c7294", 00:21:14.393 "strip_size_kb": 0, 00:21:14.393 "state": "online", 00:21:14.393 "raid_level": "raid1", 00:21:14.393 "superblock": false, 00:21:14.393 "num_base_bdevs": 4, 00:21:14.393 "num_base_bdevs_discovered": 3, 00:21:14.393 "num_base_bdevs_operational": 3, 00:21:14.393 "base_bdevs_list": [ 00:21:14.393 { 00:21:14.393 "name": null, 00:21:14.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.393 "is_configured": false, 00:21:14.393 "data_offset": 0, 00:21:14.393 "data_size": 65536 00:21:14.393 }, 00:21:14.393 { 00:21:14.393 "name": "BaseBdev2", 00:21:14.393 "uuid": "a774d576-55da-40c6-a191-778df05b5f30", 00:21:14.393 "is_configured": true, 00:21:14.393 "data_offset": 0, 00:21:14.393 "data_size": 65536 00:21:14.393 }, 00:21:14.393 { 00:21:14.393 "name": "BaseBdev3", 00:21:14.393 "uuid": "b3d5fda7-afd8-4bde-8785-99f652e7679f", 00:21:14.393 "is_configured": true, 00:21:14.393 "data_offset": 0, 00:21:14.393 "data_size": 65536 00:21:14.393 }, 00:21:14.393 { 00:21:14.393 "name": "BaseBdev4", 00:21:14.393 "uuid": "61801b3c-d446-4205-bdc2-dbbf3a739da7", 00:21:14.393 "is_configured": true, 00:21:14.393 "data_offset": 0, 00:21:14.393 "data_size": 65536 00:21:14.393 } 00:21:14.393 ] 00:21:14.393 }' 00:21:14.393 13:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.394 13:47:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:14.960 13:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:14.960 13:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:14.960 13:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.960 13:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:15.219 13:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:15.219 13:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:15.219 13:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:15.478 [2024-07-12 13:47:03.948625] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:15.478 13:47:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:15.478 13:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:15.478 13:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.478 13:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:15.737 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:15.737 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:15.737 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:15.995 [2024-07-12 13:47:04.450363] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:15.995 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:15.995 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:15.995 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.995 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:16.252 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:16.252 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:16.252 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:16.511 [2024-07-12 13:47:04.952065] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:16.511 [2024-07-12 13:47:04.952152] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:16.511 [2024-07-12 13:47:04.963131] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:16.511 [2024-07-12 13:47:04.963166] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:16.511 [2024-07-12 13:47:04.963177] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1025c40 name Existed_Raid, state offline 00:21:16.511 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:16.511 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:16.511 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.511 13:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:16.770 13:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:16.770 13:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:16.770 13:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:16.770 13:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i 
= 1 )) 00:21:16.770 13:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:16.770 13:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:17.028 BaseBdev2 00:21:17.028 13:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:17.028 13:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:17.028 13:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:17.028 13:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:17.028 13:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:17.028 13:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:17.028 13:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:17.288 13:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:17.547 [ 00:21:17.547 { 00:21:17.547 "name": "BaseBdev2", 00:21:17.547 "aliases": [ 00:21:17.547 "73b96b56-8d94-49bc-bbc8-b84d3bd43552" 00:21:17.547 ], 00:21:17.547 "product_name": "Malloc disk", 00:21:17.547 "block_size": 512, 00:21:17.547 "num_blocks": 65536, 00:21:17.547 "uuid": "73b96b56-8d94-49bc-bbc8-b84d3bd43552", 00:21:17.547 "assigned_rate_limits": { 00:21:17.547 "rw_ios_per_sec": 0, 00:21:17.547 "rw_mbytes_per_sec": 0, 00:21:17.547 "r_mbytes_per_sec": 0, 00:21:17.547 "w_mbytes_per_sec": 0 00:21:17.547 }, 00:21:17.547 "claimed": false, 00:21:17.547 "zoned": false, 00:21:17.547 "supported_io_types": { 00:21:17.547 "read": true, 00:21:17.547 "write": true, 00:21:17.547 "unmap": true, 00:21:17.547 "flush": true, 00:21:17.547 "reset": true, 00:21:17.547 "nvme_admin": false, 00:21:17.547 "nvme_io": false, 00:21:17.547 "nvme_io_md": false, 00:21:17.547 "write_zeroes": true, 00:21:17.547 "zcopy": true, 00:21:17.547 "get_zone_info": false, 00:21:17.547 "zone_management": false, 00:21:17.547 "zone_append": false, 00:21:17.547 "compare": false, 00:21:17.547 "compare_and_write": false, 00:21:17.547 "abort": true, 00:21:17.547 "seek_hole": false, 00:21:17.547 "seek_data": false, 00:21:17.547 "copy": true, 00:21:17.547 "nvme_iov_md": false 00:21:17.547 }, 00:21:17.547 "memory_domains": [ 00:21:17.547 { 00:21:17.547 "dma_device_id": "system", 00:21:17.547 "dma_device_type": 1 00:21:17.547 }, 00:21:17.547 { 00:21:17.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:17.547 "dma_device_type": 2 00:21:17.547 } 00:21:17.547 ], 00:21:17.547 "driver_specific": {} 00:21:17.547 } 00:21:17.547 ] 00:21:17.547 13:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:17.547 13:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:17.547 13:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:17.547 13:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:17.806 BaseBdev3 00:21:17.806 13:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:17.806 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:17.806 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:17.806 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:17.806 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:17.806 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:17.806 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:18.065 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:18.324 [ 00:21:18.324 { 00:21:18.324 "name": "BaseBdev3", 00:21:18.324 "aliases": [ 00:21:18.324 "8213e432-ef4f-4136-a3f3-3960feebb450" 00:21:18.324 ], 00:21:18.324 "product_name": "Malloc disk", 00:21:18.324 "block_size": 512, 00:21:18.324 "num_blocks": 65536, 00:21:18.324 "uuid": "8213e432-ef4f-4136-a3f3-3960feebb450", 00:21:18.324 "assigned_rate_limits": { 00:21:18.324 "rw_ios_per_sec": 0, 00:21:18.324 "rw_mbytes_per_sec": 0, 00:21:18.324 "r_mbytes_per_sec": 0, 00:21:18.324 "w_mbytes_per_sec": 0 00:21:18.324 }, 00:21:18.324 "claimed": false, 00:21:18.324 "zoned": false, 00:21:18.324 "supported_io_types": { 00:21:18.324 "read": true, 00:21:18.324 "write": true, 00:21:18.324 "unmap": true, 00:21:18.324 "flush": true, 00:21:18.324 "reset": true, 00:21:18.324 "nvme_admin": false, 00:21:18.324 "nvme_io": false, 00:21:18.324 "nvme_io_md": false, 00:21:18.324 "write_zeroes": true, 00:21:18.324 "zcopy": true, 00:21:18.324 "get_zone_info": false, 00:21:18.324 "zone_management": false, 00:21:18.324 "zone_append": false, 00:21:18.324 "compare": false, 00:21:18.324 "compare_and_write": false, 00:21:18.324 "abort": true, 00:21:18.324 "seek_hole": false, 00:21:18.324 "seek_data": false, 00:21:18.324 "copy": true, 00:21:18.324 "nvme_iov_md": false 00:21:18.324 }, 00:21:18.324 "memory_domains": [ 00:21:18.324 { 00:21:18.324 "dma_device_id": "system", 00:21:18.324 "dma_device_type": 1 00:21:18.324 }, 00:21:18.324 { 00:21:18.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.324 "dma_device_type": 2 00:21:18.324 } 00:21:18.324 ], 00:21:18.324 "driver_specific": {} 00:21:18.324 } 00:21:18.324 ] 00:21:18.324 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:18.324 13:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:18.324 13:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:18.324 13:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:18.324 BaseBdev4 00:21:18.583 13:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:18.583 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # 
local bdev_name=BaseBdev4 00:21:18.583 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:18.583 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:18.583 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:18.583 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:18.583 13:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:18.583 13:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:18.842 [ 00:21:18.842 { 00:21:18.842 "name": "BaseBdev4", 00:21:18.842 "aliases": [ 00:21:18.842 "308ba754-8956-466f-83e3-dd81f3fb9d3b" 00:21:18.842 ], 00:21:18.842 "product_name": "Malloc disk", 00:21:18.842 "block_size": 512, 00:21:18.842 "num_blocks": 65536, 00:21:18.842 "uuid": "308ba754-8956-466f-83e3-dd81f3fb9d3b", 00:21:18.842 "assigned_rate_limits": { 00:21:18.842 "rw_ios_per_sec": 0, 00:21:18.842 "rw_mbytes_per_sec": 0, 00:21:18.842 "r_mbytes_per_sec": 0, 00:21:18.842 "w_mbytes_per_sec": 0 00:21:18.842 }, 00:21:18.842 "claimed": false, 00:21:18.842 "zoned": false, 00:21:18.842 "supported_io_types": { 00:21:18.842 "read": true, 00:21:18.842 "write": true, 00:21:18.842 "unmap": true, 00:21:18.842 "flush": true, 00:21:18.842 "reset": true, 00:21:18.842 "nvme_admin": false, 00:21:18.842 "nvme_io": false, 00:21:18.842 "nvme_io_md": false, 00:21:18.842 "write_zeroes": true, 00:21:18.842 "zcopy": true, 00:21:18.842 "get_zone_info": false, 00:21:18.842 "zone_management": false, 00:21:18.842 "zone_append": false, 00:21:18.842 "compare": false, 00:21:18.842 "compare_and_write": false, 00:21:18.842 "abort": true, 00:21:18.842 "seek_hole": false, 00:21:18.842 "seek_data": false, 00:21:18.842 "copy": true, 00:21:18.842 "nvme_iov_md": false 00:21:18.842 }, 00:21:18.842 "memory_domains": [ 00:21:18.842 { 00:21:18.842 "dma_device_id": "system", 00:21:18.842 "dma_device_type": 1 00:21:18.842 }, 00:21:18.842 { 00:21:18.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.842 "dma_device_type": 2 00:21:18.842 } 00:21:18.842 ], 00:21:18.842 "driver_specific": {} 00:21:18.842 } 00:21:18.842 ] 00:21:18.842 13:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:18.842 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:18.842 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:18.842 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:19.101 [2024-07-12 13:47:07.628620] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:19.101 [2024-07-12 13:47:07.628667] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:19.101 [2024-07-12 13:47:07.628688] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:19.101 [2024-07-12 13:47:07.630063] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:19.101 [2024-07-12 13:47:07.630107] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:19.101 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:19.101 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:19.101 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:19.102 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:19.102 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:19.102 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:19.102 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.102 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.102 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.102 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.102 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.102 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:19.360 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:19.360 "name": "Existed_Raid", 00:21:19.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:19.360 "strip_size_kb": 0, 00:21:19.360 "state": "configuring", 00:21:19.360 "raid_level": "raid1", 00:21:19.360 "superblock": false, 00:21:19.360 "num_base_bdevs": 4, 00:21:19.360 "num_base_bdevs_discovered": 3, 00:21:19.360 "num_base_bdevs_operational": 4, 00:21:19.360 "base_bdevs_list": [ 00:21:19.360 { 00:21:19.360 "name": "BaseBdev1", 00:21:19.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:19.361 "is_configured": false, 00:21:19.361 "data_offset": 0, 00:21:19.361 "data_size": 0 00:21:19.361 }, 00:21:19.361 { 00:21:19.361 "name": "BaseBdev2", 00:21:19.361 "uuid": "73b96b56-8d94-49bc-bbc8-b84d3bd43552", 00:21:19.361 "is_configured": true, 00:21:19.361 "data_offset": 0, 00:21:19.361 "data_size": 65536 00:21:19.361 }, 00:21:19.361 { 00:21:19.361 "name": "BaseBdev3", 00:21:19.361 "uuid": "8213e432-ef4f-4136-a3f3-3960feebb450", 00:21:19.361 "is_configured": true, 00:21:19.361 "data_offset": 0, 00:21:19.361 "data_size": 65536 00:21:19.361 }, 00:21:19.361 { 00:21:19.361 "name": "BaseBdev4", 00:21:19.361 "uuid": "308ba754-8956-466f-83e3-dd81f3fb9d3b", 00:21:19.361 "is_configured": true, 00:21:19.361 "data_offset": 0, 00:21:19.361 "data_size": 65536 00:21:19.361 } 00:21:19.361 ] 00:21:19.361 }' 00:21:19.361 13:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:19.361 13:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:19.927 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 
00:21:20.186 [2024-07-12 13:47:08.707452] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.186 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:20.445 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.445 "name": "Existed_Raid", 00:21:20.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.445 "strip_size_kb": 0, 00:21:20.445 "state": "configuring", 00:21:20.445 "raid_level": "raid1", 00:21:20.445 "superblock": false, 00:21:20.445 "num_base_bdevs": 4, 00:21:20.445 "num_base_bdevs_discovered": 2, 00:21:20.445 "num_base_bdevs_operational": 4, 00:21:20.445 "base_bdevs_list": [ 00:21:20.445 { 00:21:20.445 "name": "BaseBdev1", 00:21:20.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.445 "is_configured": false, 00:21:20.445 "data_offset": 0, 00:21:20.446 "data_size": 0 00:21:20.446 }, 00:21:20.446 { 00:21:20.446 "name": null, 00:21:20.446 "uuid": "73b96b56-8d94-49bc-bbc8-b84d3bd43552", 00:21:20.446 "is_configured": false, 00:21:20.446 "data_offset": 0, 00:21:20.446 "data_size": 65536 00:21:20.446 }, 00:21:20.446 { 00:21:20.446 "name": "BaseBdev3", 00:21:20.446 "uuid": "8213e432-ef4f-4136-a3f3-3960feebb450", 00:21:20.446 "is_configured": true, 00:21:20.446 "data_offset": 0, 00:21:20.446 "data_size": 65536 00:21:20.446 }, 00:21:20.446 { 00:21:20.446 "name": "BaseBdev4", 00:21:20.446 "uuid": "308ba754-8956-466f-83e3-dd81f3fb9d3b", 00:21:20.446 "is_configured": true, 00:21:20.446 "data_offset": 0, 00:21:20.446 "data_size": 65536 00:21:20.446 } 00:21:20.446 ] 00:21:20.446 }' 00:21:20.446 13:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.446 13:47:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:21.013 13:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.013 13:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 
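The `bdev_raid.sh@310` check traced just above can be replayed by hand against the same RPC socket. A minimal sketch follows; the rpc.py path, `-s` socket, RPC name, and jq filter are copied verbatim from this trace, while the variable names and the echo message are illustrative assumptions, not part of the test script.

    # Minimal sketch: replay the bdev_raid.sh@310 slot check from the trace above.
    # rpc.py path, socket, 'bdev_raid_get_bdevs all', and the jq filter are taken from the log;
    # RPC/SOCK names and the message printed below are illustrative only.
    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    configured=$("$RPC" -s "$SOCK" bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured')
    # Expected to print the confirmation: after bdev_raid_remove_base_bdev BaseBdev2,
    # base bdev slot 1 should report is_configured == false, as the trace shows next.
    [[ "$configured" == false ]] && echo 'slot 1 (BaseBdev2) is unconfigured after removal, as expected'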
00:21:21.272 13:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:21.272 13:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:21.531 [2024-07-12 13:47:10.058399] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:21.531 BaseBdev1 00:21:21.531 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:21.531 13:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:21.531 13:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:21.531 13:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:21.531 13:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:21.531 13:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:21.531 13:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:21.790 13:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:22.122 [ 00:21:22.122 { 00:21:22.122 "name": "BaseBdev1", 00:21:22.122 "aliases": [ 00:21:22.122 "eb7ac3f6-360f-4340-89cd-67f8dab1b871" 00:21:22.122 ], 00:21:22.122 "product_name": "Malloc disk", 00:21:22.122 "block_size": 512, 00:21:22.122 "num_blocks": 65536, 00:21:22.122 "uuid": "eb7ac3f6-360f-4340-89cd-67f8dab1b871", 00:21:22.122 "assigned_rate_limits": { 00:21:22.122 "rw_ios_per_sec": 0, 00:21:22.122 "rw_mbytes_per_sec": 0, 00:21:22.122 "r_mbytes_per_sec": 0, 00:21:22.122 "w_mbytes_per_sec": 0 00:21:22.122 }, 00:21:22.122 "claimed": true, 00:21:22.122 "claim_type": "exclusive_write", 00:21:22.122 "zoned": false, 00:21:22.122 "supported_io_types": { 00:21:22.122 "read": true, 00:21:22.122 "write": true, 00:21:22.122 "unmap": true, 00:21:22.122 "flush": true, 00:21:22.122 "reset": true, 00:21:22.122 "nvme_admin": false, 00:21:22.122 "nvme_io": false, 00:21:22.122 "nvme_io_md": false, 00:21:22.122 "write_zeroes": true, 00:21:22.122 "zcopy": true, 00:21:22.122 "get_zone_info": false, 00:21:22.122 "zone_management": false, 00:21:22.122 "zone_append": false, 00:21:22.122 "compare": false, 00:21:22.122 "compare_and_write": false, 00:21:22.122 "abort": true, 00:21:22.122 "seek_hole": false, 00:21:22.122 "seek_data": false, 00:21:22.122 "copy": true, 00:21:22.122 "nvme_iov_md": false 00:21:22.122 }, 00:21:22.122 "memory_domains": [ 00:21:22.122 { 00:21:22.122 "dma_device_id": "system", 00:21:22.122 "dma_device_type": 1 00:21:22.122 }, 00:21:22.122 { 00:21:22.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.122 "dma_device_type": 2 00:21:22.122 } 00:21:22.122 ], 00:21:22.122 "driver_specific": {} 00:21:22.122 } 00:21:22.122 ] 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.122 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:22.405 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:22.405 "name": "Existed_Raid", 00:21:22.405 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:22.405 "strip_size_kb": 0, 00:21:22.405 "state": "configuring", 00:21:22.405 "raid_level": "raid1", 00:21:22.405 "superblock": false, 00:21:22.405 "num_base_bdevs": 4, 00:21:22.405 "num_base_bdevs_discovered": 3, 00:21:22.405 "num_base_bdevs_operational": 4, 00:21:22.405 "base_bdevs_list": [ 00:21:22.405 { 00:21:22.405 "name": "BaseBdev1", 00:21:22.405 "uuid": "eb7ac3f6-360f-4340-89cd-67f8dab1b871", 00:21:22.405 "is_configured": true, 00:21:22.405 "data_offset": 0, 00:21:22.405 "data_size": 65536 00:21:22.405 }, 00:21:22.405 { 00:21:22.405 "name": null, 00:21:22.405 "uuid": "73b96b56-8d94-49bc-bbc8-b84d3bd43552", 00:21:22.405 "is_configured": false, 00:21:22.405 "data_offset": 0, 00:21:22.405 "data_size": 65536 00:21:22.405 }, 00:21:22.405 { 00:21:22.405 "name": "BaseBdev3", 00:21:22.405 "uuid": "8213e432-ef4f-4136-a3f3-3960feebb450", 00:21:22.405 "is_configured": true, 00:21:22.405 "data_offset": 0, 00:21:22.405 "data_size": 65536 00:21:22.405 }, 00:21:22.405 { 00:21:22.405 "name": "BaseBdev4", 00:21:22.405 "uuid": "308ba754-8956-466f-83e3-dd81f3fb9d3b", 00:21:22.405 "is_configured": true, 00:21:22.405 "data_offset": 0, 00:21:22.405 "data_size": 65536 00:21:22.405 } 00:21:22.405 ] 00:21:22.405 }' 00:21:22.405 13:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:22.405 13:47:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:23.419 13:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.419 13:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:23.679 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:23.679 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev3 00:21:23.679 [2024-07-12 13:47:12.240238] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:23.679 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:23.679 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:23.679 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:23.679 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:23.679 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:23.679 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:23.679 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:23.679 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:23.938 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:23.938 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:23.938 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.938 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:23.938 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:23.938 "name": "Existed_Raid", 00:21:23.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:23.938 "strip_size_kb": 0, 00:21:23.938 "state": "configuring", 00:21:23.938 "raid_level": "raid1", 00:21:23.938 "superblock": false, 00:21:23.938 "num_base_bdevs": 4, 00:21:23.938 "num_base_bdevs_discovered": 2, 00:21:23.938 "num_base_bdevs_operational": 4, 00:21:23.938 "base_bdevs_list": [ 00:21:23.938 { 00:21:23.938 "name": "BaseBdev1", 00:21:23.938 "uuid": "eb7ac3f6-360f-4340-89cd-67f8dab1b871", 00:21:23.938 "is_configured": true, 00:21:23.938 "data_offset": 0, 00:21:23.938 "data_size": 65536 00:21:23.938 }, 00:21:23.938 { 00:21:23.938 "name": null, 00:21:23.938 "uuid": "73b96b56-8d94-49bc-bbc8-b84d3bd43552", 00:21:23.938 "is_configured": false, 00:21:23.938 "data_offset": 0, 00:21:23.938 "data_size": 65536 00:21:23.938 }, 00:21:23.938 { 00:21:23.938 "name": null, 00:21:23.938 "uuid": "8213e432-ef4f-4136-a3f3-3960feebb450", 00:21:23.938 "is_configured": false, 00:21:23.938 "data_offset": 0, 00:21:23.938 "data_size": 65536 00:21:23.938 }, 00:21:23.938 { 00:21:23.939 "name": "BaseBdev4", 00:21:23.939 "uuid": "308ba754-8956-466f-83e3-dd81f3fb9d3b", 00:21:23.939 "is_configured": true, 00:21:23.939 "data_offset": 0, 00:21:23.939 "data_size": 65536 00:21:23.939 } 00:21:23.939 ] 00:21:23.939 }' 00:21:23.939 13:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:23.939 13:47:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:24.878 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.878 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq 
'.[0].base_bdevs_list[2].is_configured' 00:21:24.878 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:24.878 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:25.137 [2024-07-12 13:47:13.611881] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:25.137 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:25.138 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:25.138 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:25.138 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:25.138 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:25.138 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:25.138 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:25.138 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:25.138 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:25.138 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:25.138 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.138 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:25.397 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:25.397 "name": "Existed_Raid", 00:21:25.397 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.397 "strip_size_kb": 0, 00:21:25.397 "state": "configuring", 00:21:25.397 "raid_level": "raid1", 00:21:25.397 "superblock": false, 00:21:25.397 "num_base_bdevs": 4, 00:21:25.397 "num_base_bdevs_discovered": 3, 00:21:25.397 "num_base_bdevs_operational": 4, 00:21:25.397 "base_bdevs_list": [ 00:21:25.397 { 00:21:25.397 "name": "BaseBdev1", 00:21:25.397 "uuid": "eb7ac3f6-360f-4340-89cd-67f8dab1b871", 00:21:25.397 "is_configured": true, 00:21:25.397 "data_offset": 0, 00:21:25.397 "data_size": 65536 00:21:25.397 }, 00:21:25.397 { 00:21:25.397 "name": null, 00:21:25.397 "uuid": "73b96b56-8d94-49bc-bbc8-b84d3bd43552", 00:21:25.397 "is_configured": false, 00:21:25.397 "data_offset": 0, 00:21:25.397 "data_size": 65536 00:21:25.397 }, 00:21:25.397 { 00:21:25.397 "name": "BaseBdev3", 00:21:25.397 "uuid": "8213e432-ef4f-4136-a3f3-3960feebb450", 00:21:25.397 "is_configured": true, 00:21:25.397 "data_offset": 0, 00:21:25.397 "data_size": 65536 00:21:25.397 }, 00:21:25.397 { 00:21:25.397 "name": "BaseBdev4", 00:21:25.397 "uuid": "308ba754-8956-466f-83e3-dd81f3fb9d3b", 00:21:25.397 "is_configured": true, 00:21:25.397 "data_offset": 0, 00:21:25.397 "data_size": 65536 00:21:25.397 } 00:21:25.397 ] 00:21:25.397 }' 00:21:25.397 13:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:25.397 13:47:13 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:25.965 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.965 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:26.224 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:26.224 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:26.483 [2024-07-12 13:47:14.843163] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:26.483 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:26.483 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:26.484 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:26.484 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:26.484 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:26.484 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:26.484 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:26.484 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:26.484 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:26.484 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:26.484 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.484 13:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:27.053 13:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.053 "name": "Existed_Raid", 00:21:27.053 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.053 "strip_size_kb": 0, 00:21:27.053 "state": "configuring", 00:21:27.053 "raid_level": "raid1", 00:21:27.053 "superblock": false, 00:21:27.053 "num_base_bdevs": 4, 00:21:27.053 "num_base_bdevs_discovered": 2, 00:21:27.053 "num_base_bdevs_operational": 4, 00:21:27.053 "base_bdevs_list": [ 00:21:27.053 { 00:21:27.053 "name": null, 00:21:27.053 "uuid": "eb7ac3f6-360f-4340-89cd-67f8dab1b871", 00:21:27.053 "is_configured": false, 00:21:27.053 "data_offset": 0, 00:21:27.053 "data_size": 65536 00:21:27.053 }, 00:21:27.053 { 00:21:27.053 "name": null, 00:21:27.053 "uuid": "73b96b56-8d94-49bc-bbc8-b84d3bd43552", 00:21:27.053 "is_configured": false, 00:21:27.053 "data_offset": 0, 00:21:27.053 "data_size": 65536 00:21:27.053 }, 00:21:27.053 { 00:21:27.053 "name": "BaseBdev3", 00:21:27.053 "uuid": "8213e432-ef4f-4136-a3f3-3960feebb450", 00:21:27.053 "is_configured": true, 00:21:27.053 "data_offset": 0, 00:21:27.053 "data_size": 65536 00:21:27.053 }, 00:21:27.053 { 00:21:27.053 "name": "BaseBdev4", 00:21:27.053 
"uuid": "308ba754-8956-466f-83e3-dd81f3fb9d3b", 00:21:27.053 "is_configured": true, 00:21:27.053 "data_offset": 0, 00:21:27.053 "data_size": 65536 00:21:27.053 } 00:21:27.053 ] 00:21:27.053 }' 00:21:27.053 13:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.053 13:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:27.989 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.989 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:27.989 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:27.989 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:28.248 [2024-07-12 13:47:16.798871] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.248 13:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:28.814 13:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:28.814 "name": "Existed_Raid", 00:21:28.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.814 "strip_size_kb": 0, 00:21:28.814 "state": "configuring", 00:21:28.814 "raid_level": "raid1", 00:21:28.814 "superblock": false, 00:21:28.814 "num_base_bdevs": 4, 00:21:28.814 "num_base_bdevs_discovered": 3, 00:21:28.814 "num_base_bdevs_operational": 4, 00:21:28.814 "base_bdevs_list": [ 00:21:28.814 { 00:21:28.814 "name": null, 00:21:28.814 "uuid": "eb7ac3f6-360f-4340-89cd-67f8dab1b871", 00:21:28.814 "is_configured": false, 00:21:28.814 "data_offset": 0, 00:21:28.814 "data_size": 65536 00:21:28.814 }, 00:21:28.814 { 00:21:28.814 "name": "BaseBdev2", 00:21:28.815 "uuid": "73b96b56-8d94-49bc-bbc8-b84d3bd43552", 00:21:28.815 "is_configured": true, 
00:21:28.815 "data_offset": 0, 00:21:28.815 "data_size": 65536 00:21:28.815 }, 00:21:28.815 { 00:21:28.815 "name": "BaseBdev3", 00:21:28.815 "uuid": "8213e432-ef4f-4136-a3f3-3960feebb450", 00:21:28.815 "is_configured": true, 00:21:28.815 "data_offset": 0, 00:21:28.815 "data_size": 65536 00:21:28.815 }, 00:21:28.815 { 00:21:28.815 "name": "BaseBdev4", 00:21:28.815 "uuid": "308ba754-8956-466f-83e3-dd81f3fb9d3b", 00:21:28.815 "is_configured": true, 00:21:28.815 "data_offset": 0, 00:21:28.815 "data_size": 65536 00:21:28.815 } 00:21:28.815 ] 00:21:28.815 }' 00:21:28.815 13:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:28.815 13:47:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:29.381 13:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.381 13:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:29.639 13:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:29.639 13:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.639 13:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:29.897 13:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u eb7ac3f6-360f-4340-89cd-67f8dab1b871 00:21:30.155 [2024-07-12 13:47:18.615265] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:30.155 [2024-07-12 13:47:18.615307] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1029910 00:21:30.155 [2024-07-12 13:47:18.615315] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:30.155 [2024-07-12 13:47:18.615515] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1025320 00:21:30.155 [2024-07-12 13:47:18.615641] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1029910 00:21:30.155 [2024-07-12 13:47:18.615651] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1029910 00:21:30.155 [2024-07-12 13:47:18.615817] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:30.155 NewBaseBdev 00:21:30.155 13:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:30.155 13:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:30.155 13:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:30.156 13:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:30.156 13:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:30.156 13:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:30.156 13:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:21:30.414 13:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:30.414 [ 00:21:30.414 { 00:21:30.414 "name": "NewBaseBdev", 00:21:30.414 "aliases": [ 00:21:30.414 "eb7ac3f6-360f-4340-89cd-67f8dab1b871" 00:21:30.414 ], 00:21:30.414 "product_name": "Malloc disk", 00:21:30.414 "block_size": 512, 00:21:30.414 "num_blocks": 65536, 00:21:30.414 "uuid": "eb7ac3f6-360f-4340-89cd-67f8dab1b871", 00:21:30.414 "assigned_rate_limits": { 00:21:30.414 "rw_ios_per_sec": 0, 00:21:30.414 "rw_mbytes_per_sec": 0, 00:21:30.414 "r_mbytes_per_sec": 0, 00:21:30.414 "w_mbytes_per_sec": 0 00:21:30.414 }, 00:21:30.414 "claimed": true, 00:21:30.414 "claim_type": "exclusive_write", 00:21:30.414 "zoned": false, 00:21:30.414 "supported_io_types": { 00:21:30.414 "read": true, 00:21:30.414 "write": true, 00:21:30.414 "unmap": true, 00:21:30.414 "flush": true, 00:21:30.414 "reset": true, 00:21:30.414 "nvme_admin": false, 00:21:30.414 "nvme_io": false, 00:21:30.414 "nvme_io_md": false, 00:21:30.414 "write_zeroes": true, 00:21:30.414 "zcopy": true, 00:21:30.414 "get_zone_info": false, 00:21:30.414 "zone_management": false, 00:21:30.414 "zone_append": false, 00:21:30.414 "compare": false, 00:21:30.414 "compare_and_write": false, 00:21:30.414 "abort": true, 00:21:30.414 "seek_hole": false, 00:21:30.414 "seek_data": false, 00:21:30.414 "copy": true, 00:21:30.414 "nvme_iov_md": false 00:21:30.414 }, 00:21:30.414 "memory_domains": [ 00:21:30.414 { 00:21:30.414 "dma_device_id": "system", 00:21:30.414 "dma_device_type": 1 00:21:30.414 }, 00:21:30.414 { 00:21:30.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.414 "dma_device_type": 2 00:21:30.414 } 00:21:30.414 ], 00:21:30.414 "driver_specific": {} 00:21:30.414 } 00:21:30.414 ] 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.673 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:30.933 13:47:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.933 "name": "Existed_Raid", 00:21:30.933 "uuid": "3b8266fe-655e-4b21-b0c8-a4ddb7a8892b", 00:21:30.933 "strip_size_kb": 0, 00:21:30.933 "state": "online", 00:21:30.933 "raid_level": "raid1", 00:21:30.933 "superblock": false, 00:21:30.933 "num_base_bdevs": 4, 00:21:30.933 "num_base_bdevs_discovered": 4, 00:21:30.933 "num_base_bdevs_operational": 4, 00:21:30.933 "base_bdevs_list": [ 00:21:30.933 { 00:21:30.933 "name": "NewBaseBdev", 00:21:30.933 "uuid": "eb7ac3f6-360f-4340-89cd-67f8dab1b871", 00:21:30.933 "is_configured": true, 00:21:30.933 "data_offset": 0, 00:21:30.933 "data_size": 65536 00:21:30.933 }, 00:21:30.933 { 00:21:30.933 "name": "BaseBdev2", 00:21:30.933 "uuid": "73b96b56-8d94-49bc-bbc8-b84d3bd43552", 00:21:30.933 "is_configured": true, 00:21:30.933 "data_offset": 0, 00:21:30.933 "data_size": 65536 00:21:30.933 }, 00:21:30.933 { 00:21:30.933 "name": "BaseBdev3", 00:21:30.933 "uuid": "8213e432-ef4f-4136-a3f3-3960feebb450", 00:21:30.933 "is_configured": true, 00:21:30.933 "data_offset": 0, 00:21:30.933 "data_size": 65536 00:21:30.933 }, 00:21:30.933 { 00:21:30.933 "name": "BaseBdev4", 00:21:30.933 "uuid": "308ba754-8956-466f-83e3-dd81f3fb9d3b", 00:21:30.933 "is_configured": true, 00:21:30.933 "data_offset": 0, 00:21:30.933 "data_size": 65536 00:21:30.933 } 00:21:30.933 ] 00:21:30.933 }' 00:21:30.933 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:30.933 13:47:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:31.498 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:31.498 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:31.498 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:31.498 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:31.498 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:31.498 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:31.498 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:31.498 13:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:31.756 [2024-07-12 13:47:20.099690] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:31.756 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:31.756 "name": "Existed_Raid", 00:21:31.756 "aliases": [ 00:21:31.756 "3b8266fe-655e-4b21-b0c8-a4ddb7a8892b" 00:21:31.756 ], 00:21:31.756 "product_name": "Raid Volume", 00:21:31.756 "block_size": 512, 00:21:31.756 "num_blocks": 65536, 00:21:31.756 "uuid": "3b8266fe-655e-4b21-b0c8-a4ddb7a8892b", 00:21:31.756 "assigned_rate_limits": { 00:21:31.756 "rw_ios_per_sec": 0, 00:21:31.756 "rw_mbytes_per_sec": 0, 00:21:31.756 "r_mbytes_per_sec": 0, 00:21:31.756 "w_mbytes_per_sec": 0 00:21:31.756 }, 00:21:31.756 "claimed": false, 00:21:31.756 "zoned": false, 00:21:31.756 "supported_io_types": { 00:21:31.756 "read": true, 00:21:31.756 "write": true, 00:21:31.756 "unmap": false, 00:21:31.756 "flush": false, 00:21:31.756 "reset": true, 00:21:31.756 "nvme_admin": false, 
00:21:31.756 "nvme_io": false, 00:21:31.756 "nvme_io_md": false, 00:21:31.756 "write_zeroes": true, 00:21:31.756 "zcopy": false, 00:21:31.756 "get_zone_info": false, 00:21:31.756 "zone_management": false, 00:21:31.756 "zone_append": false, 00:21:31.756 "compare": false, 00:21:31.756 "compare_and_write": false, 00:21:31.756 "abort": false, 00:21:31.756 "seek_hole": false, 00:21:31.756 "seek_data": false, 00:21:31.756 "copy": false, 00:21:31.756 "nvme_iov_md": false 00:21:31.756 }, 00:21:31.756 "memory_domains": [ 00:21:31.756 { 00:21:31.756 "dma_device_id": "system", 00:21:31.756 "dma_device_type": 1 00:21:31.756 }, 00:21:31.756 { 00:21:31.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.756 "dma_device_type": 2 00:21:31.756 }, 00:21:31.756 { 00:21:31.756 "dma_device_id": "system", 00:21:31.756 "dma_device_type": 1 00:21:31.756 }, 00:21:31.756 { 00:21:31.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.756 "dma_device_type": 2 00:21:31.756 }, 00:21:31.756 { 00:21:31.756 "dma_device_id": "system", 00:21:31.756 "dma_device_type": 1 00:21:31.756 }, 00:21:31.756 { 00:21:31.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.756 "dma_device_type": 2 00:21:31.756 }, 00:21:31.756 { 00:21:31.756 "dma_device_id": "system", 00:21:31.756 "dma_device_type": 1 00:21:31.756 }, 00:21:31.756 { 00:21:31.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.756 "dma_device_type": 2 00:21:31.756 } 00:21:31.756 ], 00:21:31.756 "driver_specific": { 00:21:31.756 "raid": { 00:21:31.756 "uuid": "3b8266fe-655e-4b21-b0c8-a4ddb7a8892b", 00:21:31.756 "strip_size_kb": 0, 00:21:31.756 "state": "online", 00:21:31.756 "raid_level": "raid1", 00:21:31.756 "superblock": false, 00:21:31.756 "num_base_bdevs": 4, 00:21:31.756 "num_base_bdevs_discovered": 4, 00:21:31.756 "num_base_bdevs_operational": 4, 00:21:31.756 "base_bdevs_list": [ 00:21:31.756 { 00:21:31.756 "name": "NewBaseBdev", 00:21:31.756 "uuid": "eb7ac3f6-360f-4340-89cd-67f8dab1b871", 00:21:31.756 "is_configured": true, 00:21:31.756 "data_offset": 0, 00:21:31.756 "data_size": 65536 00:21:31.756 }, 00:21:31.756 { 00:21:31.756 "name": "BaseBdev2", 00:21:31.756 "uuid": "73b96b56-8d94-49bc-bbc8-b84d3bd43552", 00:21:31.756 "is_configured": true, 00:21:31.756 "data_offset": 0, 00:21:31.756 "data_size": 65536 00:21:31.756 }, 00:21:31.756 { 00:21:31.756 "name": "BaseBdev3", 00:21:31.756 "uuid": "8213e432-ef4f-4136-a3f3-3960feebb450", 00:21:31.756 "is_configured": true, 00:21:31.756 "data_offset": 0, 00:21:31.756 "data_size": 65536 00:21:31.756 }, 00:21:31.756 { 00:21:31.756 "name": "BaseBdev4", 00:21:31.756 "uuid": "308ba754-8956-466f-83e3-dd81f3fb9d3b", 00:21:31.756 "is_configured": true, 00:21:31.756 "data_offset": 0, 00:21:31.756 "data_size": 65536 00:21:31.756 } 00:21:31.756 ] 00:21:31.756 } 00:21:31.757 } 00:21:31.757 }' 00:21:31.757 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:31.757 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:31.757 BaseBdev2 00:21:31.757 BaseBdev3 00:21:31.757 BaseBdev4' 00:21:31.757 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:31.757 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:31.757 13:47:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:32.015 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:32.015 "name": "NewBaseBdev", 00:21:32.015 "aliases": [ 00:21:32.015 "eb7ac3f6-360f-4340-89cd-67f8dab1b871" 00:21:32.015 ], 00:21:32.015 "product_name": "Malloc disk", 00:21:32.015 "block_size": 512, 00:21:32.015 "num_blocks": 65536, 00:21:32.015 "uuid": "eb7ac3f6-360f-4340-89cd-67f8dab1b871", 00:21:32.015 "assigned_rate_limits": { 00:21:32.015 "rw_ios_per_sec": 0, 00:21:32.015 "rw_mbytes_per_sec": 0, 00:21:32.015 "r_mbytes_per_sec": 0, 00:21:32.015 "w_mbytes_per_sec": 0 00:21:32.015 }, 00:21:32.015 "claimed": true, 00:21:32.015 "claim_type": "exclusive_write", 00:21:32.015 "zoned": false, 00:21:32.015 "supported_io_types": { 00:21:32.015 "read": true, 00:21:32.015 "write": true, 00:21:32.015 "unmap": true, 00:21:32.015 "flush": true, 00:21:32.015 "reset": true, 00:21:32.015 "nvme_admin": false, 00:21:32.015 "nvme_io": false, 00:21:32.015 "nvme_io_md": false, 00:21:32.015 "write_zeroes": true, 00:21:32.015 "zcopy": true, 00:21:32.015 "get_zone_info": false, 00:21:32.015 "zone_management": false, 00:21:32.015 "zone_append": false, 00:21:32.015 "compare": false, 00:21:32.015 "compare_and_write": false, 00:21:32.015 "abort": true, 00:21:32.015 "seek_hole": false, 00:21:32.015 "seek_data": false, 00:21:32.015 "copy": true, 00:21:32.015 "nvme_iov_md": false 00:21:32.015 }, 00:21:32.015 "memory_domains": [ 00:21:32.015 { 00:21:32.015 "dma_device_id": "system", 00:21:32.015 "dma_device_type": 1 00:21:32.015 }, 00:21:32.015 { 00:21:32.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.015 "dma_device_type": 2 00:21:32.015 } 00:21:32.015 ], 00:21:32.015 "driver_specific": {} 00:21:32.015 }' 00:21:32.015 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.015 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.015 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:32.015 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.015 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.274 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:32.274 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.274 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.274 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:32.274 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.274 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.274 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:32.274 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:32.274 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:32.274 13:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:32.532 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:32.532 "name": "BaseBdev2", 
00:21:32.532 "aliases": [ 00:21:32.532 "73b96b56-8d94-49bc-bbc8-b84d3bd43552" 00:21:32.532 ], 00:21:32.532 "product_name": "Malloc disk", 00:21:32.532 "block_size": 512, 00:21:32.532 "num_blocks": 65536, 00:21:32.532 "uuid": "73b96b56-8d94-49bc-bbc8-b84d3bd43552", 00:21:32.532 "assigned_rate_limits": { 00:21:32.532 "rw_ios_per_sec": 0, 00:21:32.532 "rw_mbytes_per_sec": 0, 00:21:32.532 "r_mbytes_per_sec": 0, 00:21:32.532 "w_mbytes_per_sec": 0 00:21:32.532 }, 00:21:32.532 "claimed": true, 00:21:32.532 "claim_type": "exclusive_write", 00:21:32.532 "zoned": false, 00:21:32.532 "supported_io_types": { 00:21:32.532 "read": true, 00:21:32.532 "write": true, 00:21:32.532 "unmap": true, 00:21:32.532 "flush": true, 00:21:32.532 "reset": true, 00:21:32.532 "nvme_admin": false, 00:21:32.532 "nvme_io": false, 00:21:32.532 "nvme_io_md": false, 00:21:32.532 "write_zeroes": true, 00:21:32.532 "zcopy": true, 00:21:32.532 "get_zone_info": false, 00:21:32.532 "zone_management": false, 00:21:32.532 "zone_append": false, 00:21:32.532 "compare": false, 00:21:32.532 "compare_and_write": false, 00:21:32.532 "abort": true, 00:21:32.532 "seek_hole": false, 00:21:32.532 "seek_data": false, 00:21:32.532 "copy": true, 00:21:32.532 "nvme_iov_md": false 00:21:32.532 }, 00:21:32.532 "memory_domains": [ 00:21:32.532 { 00:21:32.532 "dma_device_id": "system", 00:21:32.532 "dma_device_type": 1 00:21:32.532 }, 00:21:32.532 { 00:21:32.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.532 "dma_device_type": 2 00:21:32.532 } 00:21:32.532 ], 00:21:32.532 "driver_specific": {} 00:21:32.532 }' 00:21:32.532 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.532 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.532 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:32.532 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.791 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.791 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:32.791 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.791 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.791 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:32.791 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.791 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.049 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:33.049 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:33.049 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:33.049 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:33.308 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:33.308 "name": "BaseBdev3", 00:21:33.308 "aliases": [ 00:21:33.308 "8213e432-ef4f-4136-a3f3-3960feebb450" 00:21:33.308 ], 00:21:33.308 "product_name": "Malloc disk", 00:21:33.308 "block_size": 512, 
00:21:33.308 "num_blocks": 65536, 00:21:33.308 "uuid": "8213e432-ef4f-4136-a3f3-3960feebb450", 00:21:33.308 "assigned_rate_limits": { 00:21:33.308 "rw_ios_per_sec": 0, 00:21:33.308 "rw_mbytes_per_sec": 0, 00:21:33.308 "r_mbytes_per_sec": 0, 00:21:33.308 "w_mbytes_per_sec": 0 00:21:33.308 }, 00:21:33.308 "claimed": true, 00:21:33.308 "claim_type": "exclusive_write", 00:21:33.308 "zoned": false, 00:21:33.308 "supported_io_types": { 00:21:33.308 "read": true, 00:21:33.308 "write": true, 00:21:33.308 "unmap": true, 00:21:33.308 "flush": true, 00:21:33.308 "reset": true, 00:21:33.308 "nvme_admin": false, 00:21:33.308 "nvme_io": false, 00:21:33.308 "nvme_io_md": false, 00:21:33.308 "write_zeroes": true, 00:21:33.308 "zcopy": true, 00:21:33.308 "get_zone_info": false, 00:21:33.308 "zone_management": false, 00:21:33.308 "zone_append": false, 00:21:33.308 "compare": false, 00:21:33.308 "compare_and_write": false, 00:21:33.308 "abort": true, 00:21:33.308 "seek_hole": false, 00:21:33.308 "seek_data": false, 00:21:33.308 "copy": true, 00:21:33.308 "nvme_iov_md": false 00:21:33.308 }, 00:21:33.309 "memory_domains": [ 00:21:33.309 { 00:21:33.309 "dma_device_id": "system", 00:21:33.309 "dma_device_type": 1 00:21:33.309 }, 00:21:33.309 { 00:21:33.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.309 "dma_device_type": 2 00:21:33.309 } 00:21:33.309 ], 00:21:33.309 "driver_specific": {} 00:21:33.309 }' 00:21:33.309 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.309 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.309 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:33.309 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.309 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.309 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:33.309 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.309 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.568 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:33.568 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.568 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.568 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:33.568 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:33.568 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:33.568 13:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:33.828 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:33.828 "name": "BaseBdev4", 00:21:33.828 "aliases": [ 00:21:33.828 "308ba754-8956-466f-83e3-dd81f3fb9d3b" 00:21:33.828 ], 00:21:33.828 "product_name": "Malloc disk", 00:21:33.828 "block_size": 512, 00:21:33.828 "num_blocks": 65536, 00:21:33.828 "uuid": "308ba754-8956-466f-83e3-dd81f3fb9d3b", 00:21:33.828 "assigned_rate_limits": { 00:21:33.828 "rw_ios_per_sec": 0, 00:21:33.828 
"rw_mbytes_per_sec": 0, 00:21:33.828 "r_mbytes_per_sec": 0, 00:21:33.828 "w_mbytes_per_sec": 0 00:21:33.828 }, 00:21:33.828 "claimed": true, 00:21:33.828 "claim_type": "exclusive_write", 00:21:33.828 "zoned": false, 00:21:33.828 "supported_io_types": { 00:21:33.828 "read": true, 00:21:33.828 "write": true, 00:21:33.828 "unmap": true, 00:21:33.828 "flush": true, 00:21:33.828 "reset": true, 00:21:33.828 "nvme_admin": false, 00:21:33.828 "nvme_io": false, 00:21:33.828 "nvme_io_md": false, 00:21:33.828 "write_zeroes": true, 00:21:33.828 "zcopy": true, 00:21:33.828 "get_zone_info": false, 00:21:33.828 "zone_management": false, 00:21:33.828 "zone_append": false, 00:21:33.828 "compare": false, 00:21:33.828 "compare_and_write": false, 00:21:33.828 "abort": true, 00:21:33.828 "seek_hole": false, 00:21:33.828 "seek_data": false, 00:21:33.828 "copy": true, 00:21:33.828 "nvme_iov_md": false 00:21:33.828 }, 00:21:33.828 "memory_domains": [ 00:21:33.828 { 00:21:33.828 "dma_device_id": "system", 00:21:33.828 "dma_device_type": 1 00:21:33.828 }, 00:21:33.828 { 00:21:33.828 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.828 "dma_device_type": 2 00:21:33.828 } 00:21:33.828 ], 00:21:33.828 "driver_specific": {} 00:21:33.828 }' 00:21:33.828 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.828 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.828 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:33.828 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.828 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.087 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:34.087 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.087 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.087 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:34.087 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:34.087 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:34.087 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:34.087 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:34.346 [2024-07-12 13:47:22.826634] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:34.346 [2024-07-12 13:47:22.826662] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:34.346 [2024-07-12 13:47:22.826714] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:34.346 [2024-07-12 13:47:22.827007] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:34.346 [2024-07-12 13:47:22.827021] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1029910 name Existed_Raid, state offline 00:21:34.346 13:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 521050 00:21:34.346 13:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 
521050 ']' 00:21:34.346 13:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 521050 00:21:34.346 13:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:21:34.346 13:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:34.346 13:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 521050 00:21:34.346 13:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:34.346 13:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:34.346 13:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 521050' 00:21:34.346 killing process with pid 521050 00:21:34.346 13:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 521050 00:21:34.346 [2024-07-12 13:47:22.900361] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:34.346 13:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 521050 00:21:34.606 [2024-07-12 13:47:22.938307] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:34.606 13:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:21:34.606 00:21:34.606 real 0m33.460s 00:21:34.606 user 1m1.469s 00:21:34.606 sys 0m5.944s 00:21:34.606 13:47:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:34.606 13:47:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:34.606 ************************************ 00:21:34.606 END TEST raid_state_function_test 00:21:34.606 ************************************ 00:21:34.865 13:47:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:34.865 13:47:23 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:21:34.865 13:47:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:34.865 13:47:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:34.865 13:47:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:34.865 ************************************ 00:21:34.865 START TEST raid_state_function_test_sb 00:21:34.865 ************************************ 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=525947 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 525947' 00:21:34.865 Process raid pid: 525947 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 525947 /var/tmp/spdk-raid.sock 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 525947 ']' 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:34.865 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
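A minimal sketch of the RPC cycle this superblock variant exercises, built only from commands visible in the trace (the rpc.py path, the /var/tmp/spdk-raid.sock socket, and the argument forms traced in bdev_raid.sh); the ordering is simplified for illustration, since the test itself creates the raid first and adds base bdevs incrementally to walk the configuring and online states:

  # helper wrapping the RPC client against the test target (paths taken from the trace)
  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  # four 32 MB malloc base bdevs with 512-byte blocks (65536 blocks each, matching the bdev dumps)
  for i in 1 2 3 4; do
    rpc bdev_malloc_create 32 512 -b "BaseBdev$i"
  done
  # assemble a raid1 volume with an on-disk superblock (-s)
  rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  # state check, the same query verify_raid_bdev_state issues
  rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
  # per-base-bdev property check, as verify_raid_bdev_properties does (block_size, md_size, dif_type)
  rpc bdev_get_bdevs -b BaseBdev1 | jq '.[].block_size'
  # tear down
  rpc bdev_raid_delete Existed_Raid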
00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:34.865 13:47:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:34.865 [2024-07-12 13:47:23.315086] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:21:34.865 [2024-07-12 13:47:23.315167] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:35.124 [2024-07-12 13:47:23.461576] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:35.124 [2024-07-12 13:47:23.568569] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:35.124 [2024-07-12 13:47:23.631181] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:35.124 [2024-07-12 13:47:23.631216] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:35.692 13:47:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:35.692 13:47:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:21:35.692 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:35.951 [2024-07-12 13:47:24.470605] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:35.951 [2024-07-12 13:47:24.470646] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:35.951 [2024-07-12 13:47:24.470657] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:35.951 [2024-07-12 13:47:24.470670] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:35.951 [2024-07-12 13:47:24.470680] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:35.951 [2024-07-12 13:47:24.470691] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:35.951 [2024-07-12 13:47:24.470700] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:35.951 [2024-07-12 13:47:24.470712] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.951 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:36.210 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:36.210 "name": "Existed_Raid", 00:21:36.210 "uuid": "345df64a-da7a-4fcb-ada9-c2df85e6b8b7", 00:21:36.210 "strip_size_kb": 0, 00:21:36.210 "state": "configuring", 00:21:36.210 "raid_level": "raid1", 00:21:36.210 "superblock": true, 00:21:36.210 "num_base_bdevs": 4, 00:21:36.210 "num_base_bdevs_discovered": 0, 00:21:36.210 "num_base_bdevs_operational": 4, 00:21:36.210 "base_bdevs_list": [ 00:21:36.210 { 00:21:36.210 "name": "BaseBdev1", 00:21:36.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.210 "is_configured": false, 00:21:36.210 "data_offset": 0, 00:21:36.210 "data_size": 0 00:21:36.210 }, 00:21:36.210 { 00:21:36.210 "name": "BaseBdev2", 00:21:36.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.210 "is_configured": false, 00:21:36.210 "data_offset": 0, 00:21:36.210 "data_size": 0 00:21:36.210 }, 00:21:36.210 { 00:21:36.210 "name": "BaseBdev3", 00:21:36.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.210 "is_configured": false, 00:21:36.210 "data_offset": 0, 00:21:36.210 "data_size": 0 00:21:36.210 }, 00:21:36.210 { 00:21:36.210 "name": "BaseBdev4", 00:21:36.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.210 "is_configured": false, 00:21:36.210 "data_offset": 0, 00:21:36.210 "data_size": 0 00:21:36.210 } 00:21:36.210 ] 00:21:36.210 }' 00:21:36.210 13:47:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:36.210 13:47:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:36.776 13:47:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:37.034 [2024-07-12 13:47:25.489152] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:37.034 [2024-07-12 13:47:25.489185] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ea2370 name Existed_Raid, state configuring 00:21:37.034 13:47:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:37.293 [2024-07-12 13:47:25.669658] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:37.293 [2024-07-12 13:47:25.669683] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:37.293 [2024-07-12 13:47:25.669692] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:37.293 [2024-07-12 13:47:25.669704] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:37.293 [2024-07-12 13:47:25.669712] bdev.c:8157:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:37.293 [2024-07-12 13:47:25.669723] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:37.293 [2024-07-12 13:47:25.669732] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:37.293 [2024-07-12 13:47:25.669743] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:37.293 13:47:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:37.551 [2024-07-12 13:47:25.928205] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:37.551 BaseBdev1 00:21:37.551 13:47:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:37.551 13:47:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:37.551 13:47:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:37.551 13:47:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:37.551 13:47:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:37.551 13:47:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:37.551 13:47:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:37.551 13:47:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:37.809 [ 00:21:37.809 { 00:21:37.809 "name": "BaseBdev1", 00:21:37.809 "aliases": [ 00:21:37.809 "647c5e20-cf67-496a-b2f8-47a5f2716d7e" 00:21:37.809 ], 00:21:37.809 "product_name": "Malloc disk", 00:21:37.809 "block_size": 512, 00:21:37.809 "num_blocks": 65536, 00:21:37.809 "uuid": "647c5e20-cf67-496a-b2f8-47a5f2716d7e", 00:21:37.809 "assigned_rate_limits": { 00:21:37.809 "rw_ios_per_sec": 0, 00:21:37.809 "rw_mbytes_per_sec": 0, 00:21:37.809 "r_mbytes_per_sec": 0, 00:21:37.809 "w_mbytes_per_sec": 0 00:21:37.809 }, 00:21:37.809 "claimed": true, 00:21:37.809 "claim_type": "exclusive_write", 00:21:37.809 "zoned": false, 00:21:37.809 "supported_io_types": { 00:21:37.809 "read": true, 00:21:37.809 "write": true, 00:21:37.809 "unmap": true, 00:21:37.809 "flush": true, 00:21:37.809 "reset": true, 00:21:37.809 "nvme_admin": false, 00:21:37.809 "nvme_io": false, 00:21:37.809 "nvme_io_md": false, 00:21:37.809 "write_zeroes": true, 00:21:37.809 "zcopy": true, 00:21:37.809 "get_zone_info": false, 00:21:37.809 "zone_management": false, 00:21:37.809 "zone_append": false, 00:21:37.809 "compare": false, 00:21:37.809 "compare_and_write": false, 00:21:37.809 "abort": true, 00:21:37.809 "seek_hole": false, 00:21:37.809 "seek_data": false, 00:21:37.809 "copy": true, 00:21:37.809 "nvme_iov_md": false 00:21:37.809 }, 00:21:37.809 "memory_domains": [ 00:21:37.809 { 00:21:37.809 "dma_device_id": "system", 00:21:37.809 "dma_device_type": 1 00:21:37.809 }, 00:21:37.809 { 00:21:37.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.809 "dma_device_type": 2 00:21:37.809 } 00:21:37.809 ], 00:21:37.809 
"driver_specific": {} 00:21:37.809 } 00:21:37.809 ] 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.809 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:38.067 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:38.067 "name": "Existed_Raid", 00:21:38.067 "uuid": "2b2a08af-8d60-4824-aa84-da9f58a28d2a", 00:21:38.067 "strip_size_kb": 0, 00:21:38.067 "state": "configuring", 00:21:38.067 "raid_level": "raid1", 00:21:38.067 "superblock": true, 00:21:38.067 "num_base_bdevs": 4, 00:21:38.067 "num_base_bdevs_discovered": 1, 00:21:38.067 "num_base_bdevs_operational": 4, 00:21:38.067 "base_bdevs_list": [ 00:21:38.067 { 00:21:38.067 "name": "BaseBdev1", 00:21:38.067 "uuid": "647c5e20-cf67-496a-b2f8-47a5f2716d7e", 00:21:38.067 "is_configured": true, 00:21:38.067 "data_offset": 2048, 00:21:38.067 "data_size": 63488 00:21:38.067 }, 00:21:38.067 { 00:21:38.067 "name": "BaseBdev2", 00:21:38.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.067 "is_configured": false, 00:21:38.067 "data_offset": 0, 00:21:38.067 "data_size": 0 00:21:38.067 }, 00:21:38.067 { 00:21:38.067 "name": "BaseBdev3", 00:21:38.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.067 "is_configured": false, 00:21:38.067 "data_offset": 0, 00:21:38.067 "data_size": 0 00:21:38.067 }, 00:21:38.067 { 00:21:38.067 "name": "BaseBdev4", 00:21:38.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.067 "is_configured": false, 00:21:38.067 "data_offset": 0, 00:21:38.067 "data_size": 0 00:21:38.067 } 00:21:38.067 ] 00:21:38.067 }' 00:21:38.067 13:47:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:38.067 13:47:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:39.002 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:39.002 
[2024-07-12 13:47:27.472293] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:39.002 [2024-07-12 13:47:27.472329] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ea1be0 name Existed_Raid, state configuring 00:21:39.002 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:39.260 [2024-07-12 13:47:27.720997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:39.260 [2024-07-12 13:47:27.722416] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:39.260 [2024-07-12 13:47:27.722452] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:39.260 [2024-07-12 13:47:27.722463] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:39.260 [2024-07-12 13:47:27.722475] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:39.260 [2024-07-12 13:47:27.722484] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:39.260 [2024-07-12 13:47:27.722496] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:39.260 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:39.260 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:39.260 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:39.261 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:39.261 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:39.261 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:39.261 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:39.261 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:39.261 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.261 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.261 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.261 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.261 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.261 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:39.519 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.519 "name": "Existed_Raid", 00:21:39.519 "uuid": "b76532d6-5901-41cb-a044-4fba5e9871d6", 00:21:39.519 "strip_size_kb": 0, 00:21:39.519 "state": "configuring", 00:21:39.519 "raid_level": "raid1", 00:21:39.519 "superblock": true, 00:21:39.519 
"num_base_bdevs": 4, 00:21:39.519 "num_base_bdevs_discovered": 1, 00:21:39.519 "num_base_bdevs_operational": 4, 00:21:39.519 "base_bdevs_list": [ 00:21:39.519 { 00:21:39.519 "name": "BaseBdev1", 00:21:39.519 "uuid": "647c5e20-cf67-496a-b2f8-47a5f2716d7e", 00:21:39.519 "is_configured": true, 00:21:39.519 "data_offset": 2048, 00:21:39.519 "data_size": 63488 00:21:39.519 }, 00:21:39.519 { 00:21:39.519 "name": "BaseBdev2", 00:21:39.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.519 "is_configured": false, 00:21:39.519 "data_offset": 0, 00:21:39.519 "data_size": 0 00:21:39.519 }, 00:21:39.519 { 00:21:39.519 "name": "BaseBdev3", 00:21:39.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.519 "is_configured": false, 00:21:39.519 "data_offset": 0, 00:21:39.519 "data_size": 0 00:21:39.519 }, 00:21:39.519 { 00:21:39.519 "name": "BaseBdev4", 00:21:39.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.519 "is_configured": false, 00:21:39.519 "data_offset": 0, 00:21:39.519 "data_size": 0 00:21:39.519 } 00:21:39.519 ] 00:21:39.519 }' 00:21:39.519 13:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.519 13:47:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:40.087 13:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:40.345 [2024-07-12 13:47:28.811273] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:40.345 BaseBdev2 00:21:40.345 13:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:40.345 13:47:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:40.345 13:47:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:40.345 13:47:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:40.345 13:47:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:40.345 13:47:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:40.345 13:47:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:40.604 13:47:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:40.863 [ 00:21:40.863 { 00:21:40.863 "name": "BaseBdev2", 00:21:40.863 "aliases": [ 00:21:40.863 "1b1ada44-af51-4f2e-af62-b7a24012e015" 00:21:40.863 ], 00:21:40.863 "product_name": "Malloc disk", 00:21:40.863 "block_size": 512, 00:21:40.863 "num_blocks": 65536, 00:21:40.863 "uuid": "1b1ada44-af51-4f2e-af62-b7a24012e015", 00:21:40.863 "assigned_rate_limits": { 00:21:40.863 "rw_ios_per_sec": 0, 00:21:40.863 "rw_mbytes_per_sec": 0, 00:21:40.863 "r_mbytes_per_sec": 0, 00:21:40.863 "w_mbytes_per_sec": 0 00:21:40.863 }, 00:21:40.863 "claimed": true, 00:21:40.863 "claim_type": "exclusive_write", 00:21:40.863 "zoned": false, 00:21:40.863 "supported_io_types": { 00:21:40.863 "read": true, 00:21:40.863 "write": true, 00:21:40.863 "unmap": true, 00:21:40.863 "flush": true, 
00:21:40.863 "reset": true, 00:21:40.863 "nvme_admin": false, 00:21:40.863 "nvme_io": false, 00:21:40.863 "nvme_io_md": false, 00:21:40.863 "write_zeroes": true, 00:21:40.863 "zcopy": true, 00:21:40.863 "get_zone_info": false, 00:21:40.863 "zone_management": false, 00:21:40.863 "zone_append": false, 00:21:40.863 "compare": false, 00:21:40.863 "compare_and_write": false, 00:21:40.863 "abort": true, 00:21:40.863 "seek_hole": false, 00:21:40.863 "seek_data": false, 00:21:40.863 "copy": true, 00:21:40.863 "nvme_iov_md": false 00:21:40.863 }, 00:21:40.863 "memory_domains": [ 00:21:40.863 { 00:21:40.863 "dma_device_id": "system", 00:21:40.863 "dma_device_type": 1 00:21:40.863 }, 00:21:40.863 { 00:21:40.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:40.863 "dma_device_type": 2 00:21:40.863 } 00:21:40.863 ], 00:21:40.863 "driver_specific": {} 00:21:40.863 } 00:21:40.863 ] 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.863 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:41.122 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:41.122 "name": "Existed_Raid", 00:21:41.122 "uuid": "b76532d6-5901-41cb-a044-4fba5e9871d6", 00:21:41.122 "strip_size_kb": 0, 00:21:41.122 "state": "configuring", 00:21:41.122 "raid_level": "raid1", 00:21:41.122 "superblock": true, 00:21:41.122 "num_base_bdevs": 4, 00:21:41.122 "num_base_bdevs_discovered": 2, 00:21:41.122 "num_base_bdevs_operational": 4, 00:21:41.122 "base_bdevs_list": [ 00:21:41.122 { 00:21:41.122 "name": "BaseBdev1", 00:21:41.122 "uuid": "647c5e20-cf67-496a-b2f8-47a5f2716d7e", 00:21:41.122 "is_configured": true, 00:21:41.122 "data_offset": 2048, 00:21:41.122 "data_size": 63488 00:21:41.122 }, 00:21:41.122 { 00:21:41.122 "name": "BaseBdev2", 00:21:41.122 "uuid": 
"1b1ada44-af51-4f2e-af62-b7a24012e015", 00:21:41.122 "is_configured": true, 00:21:41.122 "data_offset": 2048, 00:21:41.122 "data_size": 63488 00:21:41.122 }, 00:21:41.122 { 00:21:41.122 "name": "BaseBdev3", 00:21:41.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:41.122 "is_configured": false, 00:21:41.122 "data_offset": 0, 00:21:41.122 "data_size": 0 00:21:41.122 }, 00:21:41.122 { 00:21:41.122 "name": "BaseBdev4", 00:21:41.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:41.122 "is_configured": false, 00:21:41.122 "data_offset": 0, 00:21:41.122 "data_size": 0 00:21:41.122 } 00:21:41.122 ] 00:21:41.122 }' 00:21:41.122 13:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:41.122 13:47:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:41.690 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:41.949 [2024-07-12 13:47:30.346807] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:41.949 BaseBdev3 00:21:41.949 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:41.949 13:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:41.949 13:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:41.949 13:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:41.949 13:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:41.949 13:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:41.949 13:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:42.209 13:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:42.209 [ 00:21:42.209 { 00:21:42.209 "name": "BaseBdev3", 00:21:42.209 "aliases": [ 00:21:42.209 "bb35dd1f-ceb6-4ec3-b5b3-0c446229e96b" 00:21:42.209 ], 00:21:42.209 "product_name": "Malloc disk", 00:21:42.209 "block_size": 512, 00:21:42.209 "num_blocks": 65536, 00:21:42.209 "uuid": "bb35dd1f-ceb6-4ec3-b5b3-0c446229e96b", 00:21:42.209 "assigned_rate_limits": { 00:21:42.209 "rw_ios_per_sec": 0, 00:21:42.209 "rw_mbytes_per_sec": 0, 00:21:42.209 "r_mbytes_per_sec": 0, 00:21:42.209 "w_mbytes_per_sec": 0 00:21:42.209 }, 00:21:42.209 "claimed": true, 00:21:42.209 "claim_type": "exclusive_write", 00:21:42.209 "zoned": false, 00:21:42.209 "supported_io_types": { 00:21:42.209 "read": true, 00:21:42.209 "write": true, 00:21:42.209 "unmap": true, 00:21:42.209 "flush": true, 00:21:42.209 "reset": true, 00:21:42.209 "nvme_admin": false, 00:21:42.209 "nvme_io": false, 00:21:42.209 "nvme_io_md": false, 00:21:42.209 "write_zeroes": true, 00:21:42.209 "zcopy": true, 00:21:42.209 "get_zone_info": false, 00:21:42.209 "zone_management": false, 00:21:42.209 "zone_append": false, 00:21:42.209 "compare": false, 00:21:42.209 "compare_and_write": false, 00:21:42.209 "abort": true, 00:21:42.209 "seek_hole": false, 00:21:42.209 
"seek_data": false, 00:21:42.209 "copy": true, 00:21:42.209 "nvme_iov_md": false 00:21:42.209 }, 00:21:42.209 "memory_domains": [ 00:21:42.209 { 00:21:42.209 "dma_device_id": "system", 00:21:42.209 "dma_device_type": 1 00:21:42.209 }, 00:21:42.209 { 00:21:42.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:42.209 "dma_device_type": 2 00:21:42.209 } 00:21:42.209 ], 00:21:42.210 "driver_specific": {} 00:21:42.210 } 00:21:42.210 ] 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:42.210 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.469 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.469 "name": "Existed_Raid", 00:21:42.469 "uuid": "b76532d6-5901-41cb-a044-4fba5e9871d6", 00:21:42.469 "strip_size_kb": 0, 00:21:42.469 "state": "configuring", 00:21:42.469 "raid_level": "raid1", 00:21:42.469 "superblock": true, 00:21:42.469 "num_base_bdevs": 4, 00:21:42.469 "num_base_bdevs_discovered": 3, 00:21:42.469 "num_base_bdevs_operational": 4, 00:21:42.469 "base_bdevs_list": [ 00:21:42.469 { 00:21:42.469 "name": "BaseBdev1", 00:21:42.469 "uuid": "647c5e20-cf67-496a-b2f8-47a5f2716d7e", 00:21:42.469 "is_configured": true, 00:21:42.469 "data_offset": 2048, 00:21:42.469 "data_size": 63488 00:21:42.469 }, 00:21:42.469 { 00:21:42.469 "name": "BaseBdev2", 00:21:42.469 "uuid": "1b1ada44-af51-4f2e-af62-b7a24012e015", 00:21:42.469 "is_configured": true, 00:21:42.469 "data_offset": 2048, 00:21:42.469 "data_size": 63488 00:21:42.469 }, 00:21:42.469 { 00:21:42.469 "name": "BaseBdev3", 00:21:42.469 "uuid": "bb35dd1f-ceb6-4ec3-b5b3-0c446229e96b", 00:21:42.469 "is_configured": true, 00:21:42.469 "data_offset": 2048, 00:21:42.469 "data_size": 63488 00:21:42.469 }, 00:21:42.469 { 00:21:42.469 "name": "BaseBdev4", 00:21:42.469 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:21:42.469 "is_configured": false, 00:21:42.469 "data_offset": 0, 00:21:42.469 "data_size": 0 00:21:42.469 } 00:21:42.469 ] 00:21:42.469 }' 00:21:42.469 13:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.469 13:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:43.037 13:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:43.296 [2024-07-12 13:47:31.721868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:43.296 [2024-07-12 13:47:31.722046] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ea2c40 00:21:43.296 [2024-07-12 13:47:31.722062] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:43.296 [2024-07-12 13:47:31.722233] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ea38c0 00:21:43.296 [2024-07-12 13:47:31.722359] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ea2c40 00:21:43.296 [2024-07-12 13:47:31.722370] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ea2c40 00:21:43.296 [2024-07-12 13:47:31.722463] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:43.296 BaseBdev4 00:21:43.296 13:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:43.296 13:47:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:43.297 13:47:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:43.297 13:47:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:43.297 13:47:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:43.297 13:47:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:43.297 13:47:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:43.556 13:47:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:43.556 [ 00:21:43.556 { 00:21:43.556 "name": "BaseBdev4", 00:21:43.556 "aliases": [ 00:21:43.556 "7a1058c0-befb-4394-bd35-d9c69f08bdde" 00:21:43.556 ], 00:21:43.556 "product_name": "Malloc disk", 00:21:43.556 "block_size": 512, 00:21:43.556 "num_blocks": 65536, 00:21:43.556 "uuid": "7a1058c0-befb-4394-bd35-d9c69f08bdde", 00:21:43.556 "assigned_rate_limits": { 00:21:43.556 "rw_ios_per_sec": 0, 00:21:43.556 "rw_mbytes_per_sec": 0, 00:21:43.556 "r_mbytes_per_sec": 0, 00:21:43.556 "w_mbytes_per_sec": 0 00:21:43.556 }, 00:21:43.556 "claimed": true, 00:21:43.556 "claim_type": "exclusive_write", 00:21:43.556 "zoned": false, 00:21:43.556 "supported_io_types": { 00:21:43.556 "read": true, 00:21:43.556 "write": true, 00:21:43.556 "unmap": true, 00:21:43.556 "flush": true, 00:21:43.556 "reset": true, 00:21:43.556 "nvme_admin": false, 00:21:43.556 "nvme_io": false, 00:21:43.556 "nvme_io_md": false, 00:21:43.556 
"write_zeroes": true, 00:21:43.556 "zcopy": true, 00:21:43.556 "get_zone_info": false, 00:21:43.556 "zone_management": false, 00:21:43.556 "zone_append": false, 00:21:43.556 "compare": false, 00:21:43.556 "compare_and_write": false, 00:21:43.556 "abort": true, 00:21:43.556 "seek_hole": false, 00:21:43.556 "seek_data": false, 00:21:43.556 "copy": true, 00:21:43.556 "nvme_iov_md": false 00:21:43.556 }, 00:21:43.556 "memory_domains": [ 00:21:43.556 { 00:21:43.556 "dma_device_id": "system", 00:21:43.556 "dma_device_type": 1 00:21:43.556 }, 00:21:43.556 { 00:21:43.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.556 "dma_device_type": 2 00:21:43.556 } 00:21:43.556 ], 00:21:43.556 "driver_specific": {} 00:21:43.556 } 00:21:43.556 ] 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.556 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:43.815 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:43.815 "name": "Existed_Raid", 00:21:43.815 "uuid": "b76532d6-5901-41cb-a044-4fba5e9871d6", 00:21:43.815 "strip_size_kb": 0, 00:21:43.815 "state": "online", 00:21:43.815 "raid_level": "raid1", 00:21:43.815 "superblock": true, 00:21:43.815 "num_base_bdevs": 4, 00:21:43.815 "num_base_bdevs_discovered": 4, 00:21:43.815 "num_base_bdevs_operational": 4, 00:21:43.815 "base_bdevs_list": [ 00:21:43.815 { 00:21:43.815 "name": "BaseBdev1", 00:21:43.815 "uuid": "647c5e20-cf67-496a-b2f8-47a5f2716d7e", 00:21:43.815 "is_configured": true, 00:21:43.815 "data_offset": 2048, 00:21:43.815 "data_size": 63488 00:21:43.815 }, 00:21:43.815 { 00:21:43.815 "name": "BaseBdev2", 00:21:43.815 "uuid": "1b1ada44-af51-4f2e-af62-b7a24012e015", 00:21:43.815 "is_configured": true, 00:21:43.815 "data_offset": 2048, 00:21:43.815 "data_size": 63488 00:21:43.815 }, 00:21:43.815 { 
00:21:43.815 "name": "BaseBdev3", 00:21:43.815 "uuid": "bb35dd1f-ceb6-4ec3-b5b3-0c446229e96b", 00:21:43.815 "is_configured": true, 00:21:43.815 "data_offset": 2048, 00:21:43.815 "data_size": 63488 00:21:43.815 }, 00:21:43.815 { 00:21:43.815 "name": "BaseBdev4", 00:21:43.815 "uuid": "7a1058c0-befb-4394-bd35-d9c69f08bdde", 00:21:43.815 "is_configured": true, 00:21:43.815 "data_offset": 2048, 00:21:43.815 "data_size": 63488 00:21:43.815 } 00:21:43.815 ] 00:21:43.815 }' 00:21:43.815 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:43.815 13:47:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:44.752 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:44.752 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:44.752 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:44.752 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:44.752 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:44.752 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:44.752 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:44.752 13:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:44.752 [2024-07-12 13:47:33.206153] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:44.752 13:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:44.752 "name": "Existed_Raid", 00:21:44.752 "aliases": [ 00:21:44.752 "b76532d6-5901-41cb-a044-4fba5e9871d6" 00:21:44.752 ], 00:21:44.752 "product_name": "Raid Volume", 00:21:44.752 "block_size": 512, 00:21:44.752 "num_blocks": 63488, 00:21:44.752 "uuid": "b76532d6-5901-41cb-a044-4fba5e9871d6", 00:21:44.752 "assigned_rate_limits": { 00:21:44.752 "rw_ios_per_sec": 0, 00:21:44.752 "rw_mbytes_per_sec": 0, 00:21:44.752 "r_mbytes_per_sec": 0, 00:21:44.752 "w_mbytes_per_sec": 0 00:21:44.752 }, 00:21:44.752 "claimed": false, 00:21:44.752 "zoned": false, 00:21:44.752 "supported_io_types": { 00:21:44.752 "read": true, 00:21:44.752 "write": true, 00:21:44.752 "unmap": false, 00:21:44.752 "flush": false, 00:21:44.752 "reset": true, 00:21:44.752 "nvme_admin": false, 00:21:44.752 "nvme_io": false, 00:21:44.752 "nvme_io_md": false, 00:21:44.752 "write_zeroes": true, 00:21:44.752 "zcopy": false, 00:21:44.752 "get_zone_info": false, 00:21:44.752 "zone_management": false, 00:21:44.752 "zone_append": false, 00:21:44.752 "compare": false, 00:21:44.752 "compare_and_write": false, 00:21:44.752 "abort": false, 00:21:44.752 "seek_hole": false, 00:21:44.752 "seek_data": false, 00:21:44.752 "copy": false, 00:21:44.752 "nvme_iov_md": false 00:21:44.752 }, 00:21:44.752 "memory_domains": [ 00:21:44.752 { 00:21:44.752 "dma_device_id": "system", 00:21:44.752 "dma_device_type": 1 00:21:44.752 }, 00:21:44.752 { 00:21:44.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.752 "dma_device_type": 2 00:21:44.752 }, 00:21:44.752 { 00:21:44.752 "dma_device_id": "system", 00:21:44.752 "dma_device_type": 1 00:21:44.752 }, 00:21:44.752 { 
00:21:44.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.752 "dma_device_type": 2 00:21:44.752 }, 00:21:44.752 { 00:21:44.752 "dma_device_id": "system", 00:21:44.752 "dma_device_type": 1 00:21:44.752 }, 00:21:44.752 { 00:21:44.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.752 "dma_device_type": 2 00:21:44.752 }, 00:21:44.752 { 00:21:44.752 "dma_device_id": "system", 00:21:44.752 "dma_device_type": 1 00:21:44.752 }, 00:21:44.752 { 00:21:44.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.752 "dma_device_type": 2 00:21:44.752 } 00:21:44.752 ], 00:21:44.752 "driver_specific": { 00:21:44.752 "raid": { 00:21:44.752 "uuid": "b76532d6-5901-41cb-a044-4fba5e9871d6", 00:21:44.752 "strip_size_kb": 0, 00:21:44.752 "state": "online", 00:21:44.752 "raid_level": "raid1", 00:21:44.752 "superblock": true, 00:21:44.752 "num_base_bdevs": 4, 00:21:44.752 "num_base_bdevs_discovered": 4, 00:21:44.752 "num_base_bdevs_operational": 4, 00:21:44.752 "base_bdevs_list": [ 00:21:44.752 { 00:21:44.752 "name": "BaseBdev1", 00:21:44.752 "uuid": "647c5e20-cf67-496a-b2f8-47a5f2716d7e", 00:21:44.752 "is_configured": true, 00:21:44.752 "data_offset": 2048, 00:21:44.752 "data_size": 63488 00:21:44.752 }, 00:21:44.752 { 00:21:44.752 "name": "BaseBdev2", 00:21:44.752 "uuid": "1b1ada44-af51-4f2e-af62-b7a24012e015", 00:21:44.752 "is_configured": true, 00:21:44.752 "data_offset": 2048, 00:21:44.752 "data_size": 63488 00:21:44.752 }, 00:21:44.752 { 00:21:44.752 "name": "BaseBdev3", 00:21:44.752 "uuid": "bb35dd1f-ceb6-4ec3-b5b3-0c446229e96b", 00:21:44.752 "is_configured": true, 00:21:44.752 "data_offset": 2048, 00:21:44.752 "data_size": 63488 00:21:44.752 }, 00:21:44.752 { 00:21:44.752 "name": "BaseBdev4", 00:21:44.752 "uuid": "7a1058c0-befb-4394-bd35-d9c69f08bdde", 00:21:44.752 "is_configured": true, 00:21:44.752 "data_offset": 2048, 00:21:44.752 "data_size": 63488 00:21:44.752 } 00:21:44.752 ] 00:21:44.752 } 00:21:44.752 } 00:21:44.752 }' 00:21:44.752 13:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:44.752 13:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:44.752 BaseBdev2 00:21:44.752 BaseBdev3 00:21:44.752 BaseBdev4' 00:21:44.752 13:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:44.752 13:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:44.752 13:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:45.317 13:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:45.317 "name": "BaseBdev1", 00:21:45.317 "aliases": [ 00:21:45.317 "647c5e20-cf67-496a-b2f8-47a5f2716d7e" 00:21:45.317 ], 00:21:45.317 "product_name": "Malloc disk", 00:21:45.317 "block_size": 512, 00:21:45.317 "num_blocks": 65536, 00:21:45.317 "uuid": "647c5e20-cf67-496a-b2f8-47a5f2716d7e", 00:21:45.317 "assigned_rate_limits": { 00:21:45.317 "rw_ios_per_sec": 0, 00:21:45.317 "rw_mbytes_per_sec": 0, 00:21:45.317 "r_mbytes_per_sec": 0, 00:21:45.317 "w_mbytes_per_sec": 0 00:21:45.317 }, 00:21:45.317 "claimed": true, 00:21:45.317 "claim_type": "exclusive_write", 00:21:45.317 "zoned": false, 00:21:45.317 "supported_io_types": { 00:21:45.317 "read": true, 00:21:45.317 "write": true, 
00:21:45.317 "unmap": true, 00:21:45.317 "flush": true, 00:21:45.317 "reset": true, 00:21:45.317 "nvme_admin": false, 00:21:45.317 "nvme_io": false, 00:21:45.317 "nvme_io_md": false, 00:21:45.317 "write_zeroes": true, 00:21:45.317 "zcopy": true, 00:21:45.317 "get_zone_info": false, 00:21:45.317 "zone_management": false, 00:21:45.317 "zone_append": false, 00:21:45.317 "compare": false, 00:21:45.317 "compare_and_write": false, 00:21:45.317 "abort": true, 00:21:45.317 "seek_hole": false, 00:21:45.317 "seek_data": false, 00:21:45.317 "copy": true, 00:21:45.317 "nvme_iov_md": false 00:21:45.317 }, 00:21:45.317 "memory_domains": [ 00:21:45.317 { 00:21:45.317 "dma_device_id": "system", 00:21:45.317 "dma_device_type": 1 00:21:45.317 }, 00:21:45.317 { 00:21:45.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.317 "dma_device_type": 2 00:21:45.317 } 00:21:45.317 ], 00:21:45.317 "driver_specific": {} 00:21:45.317 }' 00:21:45.317 13:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.317 13:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.575 13:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:45.575 13:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.575 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.575 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:45.575 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.575 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.833 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:45.833 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.833 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.833 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:45.833 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:45.833 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:45.833 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:46.092 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:46.092 "name": "BaseBdev2", 00:21:46.092 "aliases": [ 00:21:46.092 "1b1ada44-af51-4f2e-af62-b7a24012e015" 00:21:46.092 ], 00:21:46.092 "product_name": "Malloc disk", 00:21:46.092 "block_size": 512, 00:21:46.092 "num_blocks": 65536, 00:21:46.092 "uuid": "1b1ada44-af51-4f2e-af62-b7a24012e015", 00:21:46.092 "assigned_rate_limits": { 00:21:46.092 "rw_ios_per_sec": 0, 00:21:46.092 "rw_mbytes_per_sec": 0, 00:21:46.092 "r_mbytes_per_sec": 0, 00:21:46.092 "w_mbytes_per_sec": 0 00:21:46.092 }, 00:21:46.092 "claimed": true, 00:21:46.092 "claim_type": "exclusive_write", 00:21:46.092 "zoned": false, 00:21:46.092 "supported_io_types": { 00:21:46.092 "read": true, 00:21:46.092 "write": true, 00:21:46.092 "unmap": true, 00:21:46.092 "flush": true, 00:21:46.092 "reset": true, 00:21:46.092 "nvme_admin": false, 00:21:46.092 
"nvme_io": false, 00:21:46.092 "nvme_io_md": false, 00:21:46.092 "write_zeroes": true, 00:21:46.092 "zcopy": true, 00:21:46.092 "get_zone_info": false, 00:21:46.092 "zone_management": false, 00:21:46.092 "zone_append": false, 00:21:46.092 "compare": false, 00:21:46.092 "compare_and_write": false, 00:21:46.092 "abort": true, 00:21:46.092 "seek_hole": false, 00:21:46.092 "seek_data": false, 00:21:46.092 "copy": true, 00:21:46.092 "nvme_iov_md": false 00:21:46.092 }, 00:21:46.092 "memory_domains": [ 00:21:46.092 { 00:21:46.092 "dma_device_id": "system", 00:21:46.092 "dma_device_type": 1 00:21:46.092 }, 00:21:46.092 { 00:21:46.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.092 "dma_device_type": 2 00:21:46.092 } 00:21:46.092 ], 00:21:46.092 "driver_specific": {} 00:21:46.092 }' 00:21:46.092 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.092 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.092 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:46.092 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.351 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.351 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:46.351 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.351 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.351 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:46.351 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.351 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.608 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:46.608 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:46.608 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:46.608 13:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:47.175 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:47.175 "name": "BaseBdev3", 00:21:47.175 "aliases": [ 00:21:47.175 "bb35dd1f-ceb6-4ec3-b5b3-0c446229e96b" 00:21:47.175 ], 00:21:47.175 "product_name": "Malloc disk", 00:21:47.175 "block_size": 512, 00:21:47.175 "num_blocks": 65536, 00:21:47.175 "uuid": "bb35dd1f-ceb6-4ec3-b5b3-0c446229e96b", 00:21:47.175 "assigned_rate_limits": { 00:21:47.175 "rw_ios_per_sec": 0, 00:21:47.175 "rw_mbytes_per_sec": 0, 00:21:47.175 "r_mbytes_per_sec": 0, 00:21:47.175 "w_mbytes_per_sec": 0 00:21:47.175 }, 00:21:47.175 "claimed": true, 00:21:47.175 "claim_type": "exclusive_write", 00:21:47.175 "zoned": false, 00:21:47.175 "supported_io_types": { 00:21:47.175 "read": true, 00:21:47.175 "write": true, 00:21:47.175 "unmap": true, 00:21:47.175 "flush": true, 00:21:47.175 "reset": true, 00:21:47.175 "nvme_admin": false, 00:21:47.175 "nvme_io": false, 00:21:47.175 "nvme_io_md": false, 00:21:47.175 "write_zeroes": true, 00:21:47.175 "zcopy": true, 00:21:47.175 
"get_zone_info": false, 00:21:47.175 "zone_management": false, 00:21:47.175 "zone_append": false, 00:21:47.175 "compare": false, 00:21:47.175 "compare_and_write": false, 00:21:47.175 "abort": true, 00:21:47.175 "seek_hole": false, 00:21:47.175 "seek_data": false, 00:21:47.175 "copy": true, 00:21:47.175 "nvme_iov_md": false 00:21:47.175 }, 00:21:47.175 "memory_domains": [ 00:21:47.175 { 00:21:47.175 "dma_device_id": "system", 00:21:47.175 "dma_device_type": 1 00:21:47.175 }, 00:21:47.175 { 00:21:47.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.175 "dma_device_type": 2 00:21:47.175 } 00:21:47.176 ], 00:21:47.176 "driver_specific": {} 00:21:47.176 }' 00:21:47.176 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.176 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.176 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:47.176 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.176 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.176 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:47.176 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:47.176 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:47.434 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:47.434 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:47.434 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:47.434 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:47.434 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:47.434 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:47.434 13:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:47.693 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:47.693 "name": "BaseBdev4", 00:21:47.693 "aliases": [ 00:21:47.693 "7a1058c0-befb-4394-bd35-d9c69f08bdde" 00:21:47.693 ], 00:21:47.693 "product_name": "Malloc disk", 00:21:47.693 "block_size": 512, 00:21:47.693 "num_blocks": 65536, 00:21:47.693 "uuid": "7a1058c0-befb-4394-bd35-d9c69f08bdde", 00:21:47.693 "assigned_rate_limits": { 00:21:47.693 "rw_ios_per_sec": 0, 00:21:47.693 "rw_mbytes_per_sec": 0, 00:21:47.693 "r_mbytes_per_sec": 0, 00:21:47.693 "w_mbytes_per_sec": 0 00:21:47.693 }, 00:21:47.693 "claimed": true, 00:21:47.693 "claim_type": "exclusive_write", 00:21:47.693 "zoned": false, 00:21:47.693 "supported_io_types": { 00:21:47.693 "read": true, 00:21:47.693 "write": true, 00:21:47.693 "unmap": true, 00:21:47.693 "flush": true, 00:21:47.693 "reset": true, 00:21:47.693 "nvme_admin": false, 00:21:47.693 "nvme_io": false, 00:21:47.693 "nvme_io_md": false, 00:21:47.693 "write_zeroes": true, 00:21:47.693 "zcopy": true, 00:21:47.693 "get_zone_info": false, 00:21:47.693 "zone_management": false, 00:21:47.693 "zone_append": false, 00:21:47.693 "compare": false, 
00:21:47.693 "compare_and_write": false, 00:21:47.693 "abort": true, 00:21:47.693 "seek_hole": false, 00:21:47.693 "seek_data": false, 00:21:47.693 "copy": true, 00:21:47.693 "nvme_iov_md": false 00:21:47.693 }, 00:21:47.693 "memory_domains": [ 00:21:47.693 { 00:21:47.693 "dma_device_id": "system", 00:21:47.693 "dma_device_type": 1 00:21:47.693 }, 00:21:47.693 { 00:21:47.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.693 "dma_device_type": 2 00:21:47.693 } 00:21:47.693 ], 00:21:47.693 "driver_specific": {} 00:21:47.693 }' 00:21:47.693 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.693 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.951 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:47.951 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.951 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.951 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:47.951 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:47.951 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:48.209 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:48.209 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:48.209 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:48.209 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:48.209 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:48.467 [2024-07-12 13:47:36.891644] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.467 13:47:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.467 13:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:48.726 13:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:48.726 "name": "Existed_Raid", 00:21:48.726 "uuid": "b76532d6-5901-41cb-a044-4fba5e9871d6", 00:21:48.726 "strip_size_kb": 0, 00:21:48.726 "state": "online", 00:21:48.726 "raid_level": "raid1", 00:21:48.726 "superblock": true, 00:21:48.726 "num_base_bdevs": 4, 00:21:48.726 "num_base_bdevs_discovered": 3, 00:21:48.726 "num_base_bdevs_operational": 3, 00:21:48.726 "base_bdevs_list": [ 00:21:48.726 { 00:21:48.726 "name": null, 00:21:48.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.726 "is_configured": false, 00:21:48.726 "data_offset": 2048, 00:21:48.726 "data_size": 63488 00:21:48.726 }, 00:21:48.726 { 00:21:48.726 "name": "BaseBdev2", 00:21:48.726 "uuid": "1b1ada44-af51-4f2e-af62-b7a24012e015", 00:21:48.726 "is_configured": true, 00:21:48.726 "data_offset": 2048, 00:21:48.726 "data_size": 63488 00:21:48.726 }, 00:21:48.726 { 00:21:48.726 "name": "BaseBdev3", 00:21:48.726 "uuid": "bb35dd1f-ceb6-4ec3-b5b3-0c446229e96b", 00:21:48.726 "is_configured": true, 00:21:48.726 "data_offset": 2048, 00:21:48.726 "data_size": 63488 00:21:48.726 }, 00:21:48.726 { 00:21:48.726 "name": "BaseBdev4", 00:21:48.726 "uuid": "7a1058c0-befb-4394-bd35-d9c69f08bdde", 00:21:48.726 "is_configured": true, 00:21:48.726 "data_offset": 2048, 00:21:48.726 "data_size": 63488 00:21:48.726 } 00:21:48.726 ] 00:21:48.726 }' 00:21:48.726 13:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:48.726 13:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:49.661 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:49.661 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:49.661 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.661 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:49.661 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:49.661 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:49.661 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:49.920 [2024-07-12 13:47:38.409654] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:49.920 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:49.920 13:47:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:49.920 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.920 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:50.487 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:50.487 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:50.487 13:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:50.745 [2024-07-12 13:47:39.186335] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:50.745 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:50.745 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:50.745 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.745 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:51.004 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:51.004 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:51.004 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:51.262 [2024-07-12 13:47:39.692045] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:51.262 [2024-07-12 13:47:39.692133] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:51.262 [2024-07-12 13:47:39.704783] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:51.262 [2024-07-12 13:47:39.704826] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:51.262 [2024-07-12 13:47:39.704837] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ea2c40 name Existed_Raid, state offline 00:21:51.262 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:51.262 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:51.262 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.262 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:51.522 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:51.522 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:51.522 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:51.522 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:51.522 13:47:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:51.522 13:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:51.780 BaseBdev2 00:21:51.780 13:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:51.780 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:51.780 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:51.780 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:51.780 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:51.780 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:51.780 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:52.040 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:52.298 [ 00:21:52.298 { 00:21:52.298 "name": "BaseBdev2", 00:21:52.298 "aliases": [ 00:21:52.298 "a316fc17-22cc-457c-a03d-ec2fb51084d4" 00:21:52.298 ], 00:21:52.298 "product_name": "Malloc disk", 00:21:52.298 "block_size": 512, 00:21:52.298 "num_blocks": 65536, 00:21:52.298 "uuid": "a316fc17-22cc-457c-a03d-ec2fb51084d4", 00:21:52.298 "assigned_rate_limits": { 00:21:52.298 "rw_ios_per_sec": 0, 00:21:52.298 "rw_mbytes_per_sec": 0, 00:21:52.298 "r_mbytes_per_sec": 0, 00:21:52.298 "w_mbytes_per_sec": 0 00:21:52.298 }, 00:21:52.298 "claimed": false, 00:21:52.298 "zoned": false, 00:21:52.298 "supported_io_types": { 00:21:52.298 "read": true, 00:21:52.298 "write": true, 00:21:52.298 "unmap": true, 00:21:52.298 "flush": true, 00:21:52.298 "reset": true, 00:21:52.298 "nvme_admin": false, 00:21:52.298 "nvme_io": false, 00:21:52.298 "nvme_io_md": false, 00:21:52.298 "write_zeroes": true, 00:21:52.298 "zcopy": true, 00:21:52.298 "get_zone_info": false, 00:21:52.298 "zone_management": false, 00:21:52.298 "zone_append": false, 00:21:52.298 "compare": false, 00:21:52.298 "compare_and_write": false, 00:21:52.298 "abort": true, 00:21:52.298 "seek_hole": false, 00:21:52.298 "seek_data": false, 00:21:52.298 "copy": true, 00:21:52.298 "nvme_iov_md": false 00:21:52.298 }, 00:21:52.298 "memory_domains": [ 00:21:52.298 { 00:21:52.298 "dma_device_id": "system", 00:21:52.298 "dma_device_type": 1 00:21:52.298 }, 00:21:52.298 { 00:21:52.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.298 "dma_device_type": 2 00:21:52.298 } 00:21:52.298 ], 00:21:52.298 "driver_specific": {} 00:21:52.298 } 00:21:52.298 ] 00:21:52.298 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:52.298 13:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:52.298 13:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:52.298 13:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:52.557 BaseBdev3 00:21:52.557 13:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:52.557 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:52.557 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:52.557 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:52.557 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:52.557 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:52.557 13:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:52.838 13:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:52.838 [ 00:21:52.838 { 00:21:52.838 "name": "BaseBdev3", 00:21:52.838 "aliases": [ 00:21:52.838 "214a8b00-b58f-447d-a601-770965bee0f5" 00:21:52.838 ], 00:21:52.838 "product_name": "Malloc disk", 00:21:52.838 "block_size": 512, 00:21:52.838 "num_blocks": 65536, 00:21:52.838 "uuid": "214a8b00-b58f-447d-a601-770965bee0f5", 00:21:52.838 "assigned_rate_limits": { 00:21:52.838 "rw_ios_per_sec": 0, 00:21:52.838 "rw_mbytes_per_sec": 0, 00:21:52.838 "r_mbytes_per_sec": 0, 00:21:52.838 "w_mbytes_per_sec": 0 00:21:52.838 }, 00:21:52.838 "claimed": false, 00:21:52.838 "zoned": false, 00:21:52.838 "supported_io_types": { 00:21:52.838 "read": true, 00:21:52.838 "write": true, 00:21:52.838 "unmap": true, 00:21:52.838 "flush": true, 00:21:52.838 "reset": true, 00:21:52.838 "nvme_admin": false, 00:21:52.838 "nvme_io": false, 00:21:52.838 "nvme_io_md": false, 00:21:52.838 "write_zeroes": true, 00:21:52.838 "zcopy": true, 00:21:52.838 "get_zone_info": false, 00:21:52.838 "zone_management": false, 00:21:52.838 "zone_append": false, 00:21:52.838 "compare": false, 00:21:52.838 "compare_and_write": false, 00:21:52.838 "abort": true, 00:21:52.838 "seek_hole": false, 00:21:52.838 "seek_data": false, 00:21:52.838 "copy": true, 00:21:52.838 "nvme_iov_md": false 00:21:52.838 }, 00:21:52.838 "memory_domains": [ 00:21:52.838 { 00:21:52.838 "dma_device_id": "system", 00:21:52.839 "dma_device_type": 1 00:21:52.839 }, 00:21:52.839 { 00:21:52.839 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.839 "dma_device_type": 2 00:21:52.839 } 00:21:52.839 ], 00:21:52.839 "driver_specific": {} 00:21:52.839 } 00:21:52.839 ] 00:21:52.839 13:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:52.839 13:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:52.839 13:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:52.839 13:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:53.097 BaseBdev4 00:21:53.355 13:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:53.355 13:47:41 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:53.355 13:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:53.355 13:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:53.355 13:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:53.355 13:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:53.355 13:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:53.355 13:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:53.614 [ 00:21:53.614 { 00:21:53.614 "name": "BaseBdev4", 00:21:53.614 "aliases": [ 00:21:53.614 "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32" 00:21:53.614 ], 00:21:53.614 "product_name": "Malloc disk", 00:21:53.614 "block_size": 512, 00:21:53.614 "num_blocks": 65536, 00:21:53.614 "uuid": "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32", 00:21:53.614 "assigned_rate_limits": { 00:21:53.614 "rw_ios_per_sec": 0, 00:21:53.614 "rw_mbytes_per_sec": 0, 00:21:53.614 "r_mbytes_per_sec": 0, 00:21:53.614 "w_mbytes_per_sec": 0 00:21:53.614 }, 00:21:53.614 "claimed": false, 00:21:53.614 "zoned": false, 00:21:53.614 "supported_io_types": { 00:21:53.614 "read": true, 00:21:53.614 "write": true, 00:21:53.614 "unmap": true, 00:21:53.614 "flush": true, 00:21:53.614 "reset": true, 00:21:53.614 "nvme_admin": false, 00:21:53.614 "nvme_io": false, 00:21:53.614 "nvme_io_md": false, 00:21:53.614 "write_zeroes": true, 00:21:53.614 "zcopy": true, 00:21:53.614 "get_zone_info": false, 00:21:53.614 "zone_management": false, 00:21:53.614 "zone_append": false, 00:21:53.614 "compare": false, 00:21:53.614 "compare_and_write": false, 00:21:53.614 "abort": true, 00:21:53.614 "seek_hole": false, 00:21:53.614 "seek_data": false, 00:21:53.614 "copy": true, 00:21:53.614 "nvme_iov_md": false 00:21:53.614 }, 00:21:53.614 "memory_domains": [ 00:21:53.614 { 00:21:53.614 "dma_device_id": "system", 00:21:53.614 "dma_device_type": 1 00:21:53.614 }, 00:21:53.614 { 00:21:53.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.614 "dma_device_type": 2 00:21:53.614 } 00:21:53.614 ], 00:21:53.614 "driver_specific": {} 00:21:53.614 } 00:21:53.614 ] 00:21:53.614 13:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:53.614 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:53.614 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:53.614 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:53.873 [2024-07-12 13:47:42.381035] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:53.873 [2024-07-12 13:47:42.381077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:53.873 [2024-07-12 13:47:42.381095] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev2 is claimed 00:21:53.873 [2024-07-12 13:47:42.382476] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:53.873 [2024-07-12 13:47:42.382519] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.873 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:54.132 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:54.132 "name": "Existed_Raid", 00:21:54.132 "uuid": "855b0bbb-ac27-4f53-98e3-df5f0c442ee7", 00:21:54.132 "strip_size_kb": 0, 00:21:54.132 "state": "configuring", 00:21:54.132 "raid_level": "raid1", 00:21:54.132 "superblock": true, 00:21:54.132 "num_base_bdevs": 4, 00:21:54.132 "num_base_bdevs_discovered": 3, 00:21:54.132 "num_base_bdevs_operational": 4, 00:21:54.132 "base_bdevs_list": [ 00:21:54.132 { 00:21:54.132 "name": "BaseBdev1", 00:21:54.132 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.132 "is_configured": false, 00:21:54.132 "data_offset": 0, 00:21:54.132 "data_size": 0 00:21:54.132 }, 00:21:54.132 { 00:21:54.132 "name": "BaseBdev2", 00:21:54.132 "uuid": "a316fc17-22cc-457c-a03d-ec2fb51084d4", 00:21:54.132 "is_configured": true, 00:21:54.132 "data_offset": 2048, 00:21:54.132 "data_size": 63488 00:21:54.132 }, 00:21:54.132 { 00:21:54.132 "name": "BaseBdev3", 00:21:54.132 "uuid": "214a8b00-b58f-447d-a601-770965bee0f5", 00:21:54.132 "is_configured": true, 00:21:54.132 "data_offset": 2048, 00:21:54.132 "data_size": 63488 00:21:54.132 }, 00:21:54.132 { 00:21:54.132 "name": "BaseBdev4", 00:21:54.132 "uuid": "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32", 00:21:54.132 "is_configured": true, 00:21:54.132 "data_offset": 2048, 00:21:54.132 "data_size": 63488 00:21:54.132 } 00:21:54.132 ] 00:21:54.132 }' 00:21:54.132 13:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:54.132 13:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:54.698 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:55.264 [2024-07-12 13:47:43.756644] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.264 13:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:55.830 13:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:55.830 "name": "Existed_Raid", 00:21:55.830 "uuid": "855b0bbb-ac27-4f53-98e3-df5f0c442ee7", 00:21:55.830 "strip_size_kb": 0, 00:21:55.830 "state": "configuring", 00:21:55.830 "raid_level": "raid1", 00:21:55.830 "superblock": true, 00:21:55.830 "num_base_bdevs": 4, 00:21:55.830 "num_base_bdevs_discovered": 2, 00:21:55.830 "num_base_bdevs_operational": 4, 00:21:55.830 "base_bdevs_list": [ 00:21:55.830 { 00:21:55.830 "name": "BaseBdev1", 00:21:55.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:55.830 "is_configured": false, 00:21:55.830 "data_offset": 0, 00:21:55.830 "data_size": 0 00:21:55.830 }, 00:21:55.830 { 00:21:55.830 "name": null, 00:21:55.830 "uuid": "a316fc17-22cc-457c-a03d-ec2fb51084d4", 00:21:55.830 "is_configured": false, 00:21:55.830 "data_offset": 2048, 00:21:55.830 "data_size": 63488 00:21:55.830 }, 00:21:55.830 { 00:21:55.830 "name": "BaseBdev3", 00:21:55.830 "uuid": "214a8b00-b58f-447d-a601-770965bee0f5", 00:21:55.830 "is_configured": true, 00:21:55.830 "data_offset": 2048, 00:21:55.830 "data_size": 63488 00:21:55.830 }, 00:21:55.830 { 00:21:55.830 "name": "BaseBdev4", 00:21:55.830 "uuid": "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32", 00:21:55.830 "is_configured": true, 00:21:55.830 "data_offset": 2048, 00:21:55.830 "data_size": 63488 00:21:55.830 } 00:21:55.830 ] 00:21:55.830 }' 00:21:55.830 13:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:55.830 13:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:56.398 13:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.398 13:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:56.966 13:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:56.966 13:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:57.225 [2024-07-12 13:47:45.710373] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:57.225 BaseBdev1 00:21:57.225 13:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:57.225 13:47:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:57.225 13:47:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:57.225 13:47:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:57.225 13:47:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:57.225 13:47:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:57.225 13:47:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:57.485 13:47:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:57.744 [ 00:21:57.744 { 00:21:57.744 "name": "BaseBdev1", 00:21:57.744 "aliases": [ 00:21:57.744 "aefa846a-aac4-4f6d-9166-261bc262a7dd" 00:21:57.744 ], 00:21:57.744 "product_name": "Malloc disk", 00:21:57.744 "block_size": 512, 00:21:57.744 "num_blocks": 65536, 00:21:57.744 "uuid": "aefa846a-aac4-4f6d-9166-261bc262a7dd", 00:21:57.744 "assigned_rate_limits": { 00:21:57.744 "rw_ios_per_sec": 0, 00:21:57.744 "rw_mbytes_per_sec": 0, 00:21:57.744 "r_mbytes_per_sec": 0, 00:21:57.744 "w_mbytes_per_sec": 0 00:21:57.744 }, 00:21:57.744 "claimed": true, 00:21:57.744 "claim_type": "exclusive_write", 00:21:57.744 "zoned": false, 00:21:57.744 "supported_io_types": { 00:21:57.744 "read": true, 00:21:57.744 "write": true, 00:21:57.744 "unmap": true, 00:21:57.744 "flush": true, 00:21:57.744 "reset": true, 00:21:57.744 "nvme_admin": false, 00:21:57.744 "nvme_io": false, 00:21:57.744 "nvme_io_md": false, 00:21:57.744 "write_zeroes": true, 00:21:57.744 "zcopy": true, 00:21:57.744 "get_zone_info": false, 00:21:57.745 "zone_management": false, 00:21:57.745 "zone_append": false, 00:21:57.745 "compare": false, 00:21:57.745 "compare_and_write": false, 00:21:57.745 "abort": true, 00:21:57.745 "seek_hole": false, 00:21:57.745 "seek_data": false, 00:21:57.745 "copy": true, 00:21:57.745 "nvme_iov_md": false 00:21:57.745 }, 00:21:57.745 "memory_domains": [ 00:21:57.745 { 00:21:57.745 "dma_device_id": "system", 00:21:57.745 "dma_device_type": 1 00:21:57.745 }, 00:21:57.745 { 00:21:57.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:57.745 "dma_device_type": 2 00:21:57.745 } 00:21:57.745 ], 00:21:57.745 "driver_specific": {} 00:21:57.745 } 00:21:57.745 ] 00:21:57.745 13:47:46 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.745 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:58.004 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.004 "name": "Existed_Raid", 00:21:58.004 "uuid": "855b0bbb-ac27-4f53-98e3-df5f0c442ee7", 00:21:58.004 "strip_size_kb": 0, 00:21:58.004 "state": "configuring", 00:21:58.004 "raid_level": "raid1", 00:21:58.004 "superblock": true, 00:21:58.004 "num_base_bdevs": 4, 00:21:58.004 "num_base_bdevs_discovered": 3, 00:21:58.004 "num_base_bdevs_operational": 4, 00:21:58.004 "base_bdevs_list": [ 00:21:58.004 { 00:21:58.004 "name": "BaseBdev1", 00:21:58.004 "uuid": "aefa846a-aac4-4f6d-9166-261bc262a7dd", 00:21:58.004 "is_configured": true, 00:21:58.004 "data_offset": 2048, 00:21:58.004 "data_size": 63488 00:21:58.004 }, 00:21:58.004 { 00:21:58.004 "name": null, 00:21:58.004 "uuid": "a316fc17-22cc-457c-a03d-ec2fb51084d4", 00:21:58.004 "is_configured": false, 00:21:58.004 "data_offset": 2048, 00:21:58.004 "data_size": 63488 00:21:58.004 }, 00:21:58.004 { 00:21:58.004 "name": "BaseBdev3", 00:21:58.004 "uuid": "214a8b00-b58f-447d-a601-770965bee0f5", 00:21:58.004 "is_configured": true, 00:21:58.004 "data_offset": 2048, 00:21:58.004 "data_size": 63488 00:21:58.004 }, 00:21:58.004 { 00:21:58.004 "name": "BaseBdev4", 00:21:58.004 "uuid": "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32", 00:21:58.004 "is_configured": true, 00:21:58.004 "data_offset": 2048, 00:21:58.004 "data_size": 63488 00:21:58.004 } 00:21:58.004 ] 00:21:58.004 }' 00:21:58.004 13:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.004 13:47:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:58.570 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.570 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
jq '.[0].base_bdevs_list[0].is_configured' 00:21:58.829 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:58.829 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:59.088 [2024-07-12 13:47:47.591426] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.088 13:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:59.656 13:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.656 "name": "Existed_Raid", 00:21:59.656 "uuid": "855b0bbb-ac27-4f53-98e3-df5f0c442ee7", 00:21:59.656 "strip_size_kb": 0, 00:21:59.656 "state": "configuring", 00:21:59.656 "raid_level": "raid1", 00:21:59.656 "superblock": true, 00:21:59.656 "num_base_bdevs": 4, 00:21:59.656 "num_base_bdevs_discovered": 2, 00:21:59.656 "num_base_bdevs_operational": 4, 00:21:59.656 "base_bdevs_list": [ 00:21:59.656 { 00:21:59.656 "name": "BaseBdev1", 00:21:59.656 "uuid": "aefa846a-aac4-4f6d-9166-261bc262a7dd", 00:21:59.656 "is_configured": true, 00:21:59.656 "data_offset": 2048, 00:21:59.656 "data_size": 63488 00:21:59.656 }, 00:21:59.656 { 00:21:59.656 "name": null, 00:21:59.656 "uuid": "a316fc17-22cc-457c-a03d-ec2fb51084d4", 00:21:59.656 "is_configured": false, 00:21:59.656 "data_offset": 2048, 00:21:59.656 "data_size": 63488 00:21:59.656 }, 00:21:59.656 { 00:21:59.656 "name": null, 00:21:59.656 "uuid": "214a8b00-b58f-447d-a601-770965bee0f5", 00:21:59.656 "is_configured": false, 00:21:59.656 "data_offset": 2048, 00:21:59.656 "data_size": 63488 00:21:59.656 }, 00:21:59.656 { 00:21:59.656 "name": "BaseBdev4", 00:21:59.656 "uuid": "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32", 00:21:59.656 "is_configured": true, 00:21:59.656 "data_offset": 2048, 00:21:59.656 "data_size": 63488 00:21:59.656 } 00:21:59.656 ] 00:21:59.656 }' 00:21:59.656 13:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:21:59.656 13:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:00.222 13:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.222 13:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:00.480 13:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:00.480 13:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:00.739 [2024-07-12 13:47:49.187675] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.739 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:00.998 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.998 "name": "Existed_Raid", 00:22:00.998 "uuid": "855b0bbb-ac27-4f53-98e3-df5f0c442ee7", 00:22:00.998 "strip_size_kb": 0, 00:22:00.998 "state": "configuring", 00:22:00.998 "raid_level": "raid1", 00:22:00.998 "superblock": true, 00:22:00.998 "num_base_bdevs": 4, 00:22:00.998 "num_base_bdevs_discovered": 3, 00:22:00.998 "num_base_bdevs_operational": 4, 00:22:00.998 "base_bdevs_list": [ 00:22:00.998 { 00:22:00.998 "name": "BaseBdev1", 00:22:00.998 "uuid": "aefa846a-aac4-4f6d-9166-261bc262a7dd", 00:22:00.998 "is_configured": true, 00:22:00.998 "data_offset": 2048, 00:22:00.998 "data_size": 63488 00:22:00.998 }, 00:22:00.998 { 00:22:00.998 "name": null, 00:22:00.998 "uuid": "a316fc17-22cc-457c-a03d-ec2fb51084d4", 00:22:00.998 "is_configured": false, 00:22:00.998 "data_offset": 2048, 00:22:00.998 "data_size": 63488 00:22:00.998 }, 00:22:00.998 { 00:22:00.998 "name": "BaseBdev3", 00:22:00.998 "uuid": "214a8b00-b58f-447d-a601-770965bee0f5", 00:22:00.998 "is_configured": true, 00:22:00.998 
"data_offset": 2048, 00:22:00.998 "data_size": 63488 00:22:00.998 }, 00:22:00.998 { 00:22:00.998 "name": "BaseBdev4", 00:22:00.998 "uuid": "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32", 00:22:00.998 "is_configured": true, 00:22:00.998 "data_offset": 2048, 00:22:00.998 "data_size": 63488 00:22:00.998 } 00:22:00.998 ] 00:22:00.998 }' 00:22:00.998 13:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.998 13:47:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:01.565 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.565 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:01.824 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:01.824 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:02.083 [2024-07-12 13:47:50.535271] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.083 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:02.341 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:02.341 "name": "Existed_Raid", 00:22:02.341 "uuid": "855b0bbb-ac27-4f53-98e3-df5f0c442ee7", 00:22:02.341 "strip_size_kb": 0, 00:22:02.341 "state": "configuring", 00:22:02.341 "raid_level": "raid1", 00:22:02.341 "superblock": true, 00:22:02.341 "num_base_bdevs": 4, 00:22:02.341 "num_base_bdevs_discovered": 2, 00:22:02.341 "num_base_bdevs_operational": 4, 00:22:02.341 "base_bdevs_list": [ 00:22:02.341 { 00:22:02.341 "name": null, 00:22:02.341 "uuid": "aefa846a-aac4-4f6d-9166-261bc262a7dd", 00:22:02.341 "is_configured": false, 00:22:02.341 "data_offset": 2048, 00:22:02.341 "data_size": 63488 00:22:02.341 }, 
00:22:02.341 { 00:22:02.341 "name": null, 00:22:02.341 "uuid": "a316fc17-22cc-457c-a03d-ec2fb51084d4", 00:22:02.341 "is_configured": false, 00:22:02.341 "data_offset": 2048, 00:22:02.341 "data_size": 63488 00:22:02.341 }, 00:22:02.341 { 00:22:02.341 "name": "BaseBdev3", 00:22:02.341 "uuid": "214a8b00-b58f-447d-a601-770965bee0f5", 00:22:02.341 "is_configured": true, 00:22:02.341 "data_offset": 2048, 00:22:02.341 "data_size": 63488 00:22:02.341 }, 00:22:02.341 { 00:22:02.341 "name": "BaseBdev4", 00:22:02.341 "uuid": "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32", 00:22:02.341 "is_configured": true, 00:22:02.341 "data_offset": 2048, 00:22:02.341 "data_size": 63488 00:22:02.341 } 00:22:02.341 ] 00:22:02.341 }' 00:22:02.341 13:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:02.341 13:47:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:02.908 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.908 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:03.167 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:03.167 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:03.426 [2024-07-12 13:47:51.827143] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.426 13:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:03.685 13:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:03.685 "name": "Existed_Raid", 00:22:03.685 "uuid": "855b0bbb-ac27-4f53-98e3-df5f0c442ee7", 00:22:03.685 "strip_size_kb": 0, 00:22:03.685 "state": "configuring", 00:22:03.685 "raid_level": 
"raid1", 00:22:03.685 "superblock": true, 00:22:03.685 "num_base_bdevs": 4, 00:22:03.685 "num_base_bdevs_discovered": 3, 00:22:03.685 "num_base_bdevs_operational": 4, 00:22:03.685 "base_bdevs_list": [ 00:22:03.685 { 00:22:03.685 "name": null, 00:22:03.685 "uuid": "aefa846a-aac4-4f6d-9166-261bc262a7dd", 00:22:03.685 "is_configured": false, 00:22:03.685 "data_offset": 2048, 00:22:03.685 "data_size": 63488 00:22:03.685 }, 00:22:03.685 { 00:22:03.685 "name": "BaseBdev2", 00:22:03.685 "uuid": "a316fc17-22cc-457c-a03d-ec2fb51084d4", 00:22:03.685 "is_configured": true, 00:22:03.685 "data_offset": 2048, 00:22:03.685 "data_size": 63488 00:22:03.685 }, 00:22:03.685 { 00:22:03.685 "name": "BaseBdev3", 00:22:03.685 "uuid": "214a8b00-b58f-447d-a601-770965bee0f5", 00:22:03.685 "is_configured": true, 00:22:03.685 "data_offset": 2048, 00:22:03.685 "data_size": 63488 00:22:03.685 }, 00:22:03.685 { 00:22:03.685 "name": "BaseBdev4", 00:22:03.685 "uuid": "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32", 00:22:03.685 "is_configured": true, 00:22:03.685 "data_offset": 2048, 00:22:03.685 "data_size": 63488 00:22:03.685 } 00:22:03.685 ] 00:22:03.685 }' 00:22:03.685 13:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:03.685 13:47:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:04.251 13:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.251 13:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:04.509 13:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:04.509 13:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.509 13:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:04.766 13:47:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u aefa846a-aac4-4f6d-9166-261bc262a7dd 00:22:05.332 [2024-07-12 13:47:53.695432] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:05.332 [2024-07-12 13:47:53.695584] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ea4d90 00:22:05.332 [2024-07-12 13:47:53.695597] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:05.332 [2024-07-12 13:47:53.695771] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dbccd0 00:22:05.332 [2024-07-12 13:47:53.695892] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ea4d90 00:22:05.332 [2024-07-12 13:47:53.695902] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ea4d90 00:22:05.332 [2024-07-12 13:47:53.696006] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:05.332 NewBaseBdev 00:22:05.332 13:47:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:05.332 13:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:22:05.332 
13:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:05.332 13:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:05.332 13:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:05.332 13:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:05.332 13:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:05.900 13:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:06.158 [ 00:22:06.158 { 00:22:06.158 "name": "NewBaseBdev", 00:22:06.158 "aliases": [ 00:22:06.158 "aefa846a-aac4-4f6d-9166-261bc262a7dd" 00:22:06.158 ], 00:22:06.158 "product_name": "Malloc disk", 00:22:06.158 "block_size": 512, 00:22:06.158 "num_blocks": 65536, 00:22:06.158 "uuid": "aefa846a-aac4-4f6d-9166-261bc262a7dd", 00:22:06.158 "assigned_rate_limits": { 00:22:06.158 "rw_ios_per_sec": 0, 00:22:06.158 "rw_mbytes_per_sec": 0, 00:22:06.158 "r_mbytes_per_sec": 0, 00:22:06.158 "w_mbytes_per_sec": 0 00:22:06.158 }, 00:22:06.158 "claimed": true, 00:22:06.158 "claim_type": "exclusive_write", 00:22:06.158 "zoned": false, 00:22:06.158 "supported_io_types": { 00:22:06.158 "read": true, 00:22:06.158 "write": true, 00:22:06.158 "unmap": true, 00:22:06.158 "flush": true, 00:22:06.158 "reset": true, 00:22:06.158 "nvme_admin": false, 00:22:06.158 "nvme_io": false, 00:22:06.158 "nvme_io_md": false, 00:22:06.158 "write_zeroes": true, 00:22:06.158 "zcopy": true, 00:22:06.159 "get_zone_info": false, 00:22:06.159 "zone_management": false, 00:22:06.159 "zone_append": false, 00:22:06.159 "compare": false, 00:22:06.159 "compare_and_write": false, 00:22:06.159 "abort": true, 00:22:06.159 "seek_hole": false, 00:22:06.159 "seek_data": false, 00:22:06.159 "copy": true, 00:22:06.159 "nvme_iov_md": false 00:22:06.159 }, 00:22:06.159 "memory_domains": [ 00:22:06.159 { 00:22:06.159 "dma_device_id": "system", 00:22:06.159 "dma_device_type": 1 00:22:06.159 }, 00:22:06.159 { 00:22:06.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:06.159 "dma_device_type": 2 00:22:06.159 } 00:22:06.159 ], 00:22:06.159 "driver_specific": {} 00:22:06.159 } 00:22:06.159 ] 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.418 "name": "Existed_Raid", 00:22:06.418 "uuid": "855b0bbb-ac27-4f53-98e3-df5f0c442ee7", 00:22:06.418 "strip_size_kb": 0, 00:22:06.418 "state": "online", 00:22:06.418 "raid_level": "raid1", 00:22:06.418 "superblock": true, 00:22:06.418 "num_base_bdevs": 4, 00:22:06.418 "num_base_bdevs_discovered": 4, 00:22:06.418 "num_base_bdevs_operational": 4, 00:22:06.418 "base_bdevs_list": [ 00:22:06.418 { 00:22:06.418 "name": "NewBaseBdev", 00:22:06.418 "uuid": "aefa846a-aac4-4f6d-9166-261bc262a7dd", 00:22:06.418 "is_configured": true, 00:22:06.418 "data_offset": 2048, 00:22:06.418 "data_size": 63488 00:22:06.418 }, 00:22:06.418 { 00:22:06.418 "name": "BaseBdev2", 00:22:06.418 "uuid": "a316fc17-22cc-457c-a03d-ec2fb51084d4", 00:22:06.418 "is_configured": true, 00:22:06.418 "data_offset": 2048, 00:22:06.418 "data_size": 63488 00:22:06.418 }, 00:22:06.418 { 00:22:06.418 "name": "BaseBdev3", 00:22:06.418 "uuid": "214a8b00-b58f-447d-a601-770965bee0f5", 00:22:06.418 "is_configured": true, 00:22:06.418 "data_offset": 2048, 00:22:06.418 "data_size": 63488 00:22:06.418 }, 00:22:06.418 { 00:22:06.418 "name": "BaseBdev4", 00:22:06.418 "uuid": "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32", 00:22:06.418 "is_configured": true, 00:22:06.418 "data_offset": 2048, 00:22:06.418 "data_size": 63488 00:22:06.418 } 00:22:06.418 ] 00:22:06.418 }' 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.418 13:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:07.356 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:07.356 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:07.356 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:07.356 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:07.356 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:07.356 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:07.356 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:07.356 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:07.356 [2024-07-12 13:47:55.805353] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:07.356 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:07.357 "name": "Existed_Raid", 00:22:07.357 "aliases": [ 
00:22:07.357 "855b0bbb-ac27-4f53-98e3-df5f0c442ee7" 00:22:07.357 ], 00:22:07.357 "product_name": "Raid Volume", 00:22:07.357 "block_size": 512, 00:22:07.357 "num_blocks": 63488, 00:22:07.357 "uuid": "855b0bbb-ac27-4f53-98e3-df5f0c442ee7", 00:22:07.357 "assigned_rate_limits": { 00:22:07.357 "rw_ios_per_sec": 0, 00:22:07.357 "rw_mbytes_per_sec": 0, 00:22:07.357 "r_mbytes_per_sec": 0, 00:22:07.357 "w_mbytes_per_sec": 0 00:22:07.357 }, 00:22:07.357 "claimed": false, 00:22:07.357 "zoned": false, 00:22:07.357 "supported_io_types": { 00:22:07.357 "read": true, 00:22:07.357 "write": true, 00:22:07.357 "unmap": false, 00:22:07.357 "flush": false, 00:22:07.357 "reset": true, 00:22:07.357 "nvme_admin": false, 00:22:07.357 "nvme_io": false, 00:22:07.357 "nvme_io_md": false, 00:22:07.357 "write_zeroes": true, 00:22:07.357 "zcopy": false, 00:22:07.357 "get_zone_info": false, 00:22:07.357 "zone_management": false, 00:22:07.357 "zone_append": false, 00:22:07.357 "compare": false, 00:22:07.357 "compare_and_write": false, 00:22:07.357 "abort": false, 00:22:07.357 "seek_hole": false, 00:22:07.357 "seek_data": false, 00:22:07.357 "copy": false, 00:22:07.357 "nvme_iov_md": false 00:22:07.357 }, 00:22:07.357 "memory_domains": [ 00:22:07.357 { 00:22:07.357 "dma_device_id": "system", 00:22:07.357 "dma_device_type": 1 00:22:07.357 }, 00:22:07.357 { 00:22:07.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:07.357 "dma_device_type": 2 00:22:07.357 }, 00:22:07.357 { 00:22:07.357 "dma_device_id": "system", 00:22:07.357 "dma_device_type": 1 00:22:07.357 }, 00:22:07.357 { 00:22:07.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:07.357 "dma_device_type": 2 00:22:07.357 }, 00:22:07.357 { 00:22:07.357 "dma_device_id": "system", 00:22:07.357 "dma_device_type": 1 00:22:07.357 }, 00:22:07.357 { 00:22:07.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:07.357 "dma_device_type": 2 00:22:07.357 }, 00:22:07.357 { 00:22:07.357 "dma_device_id": "system", 00:22:07.357 "dma_device_type": 1 00:22:07.357 }, 00:22:07.357 { 00:22:07.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:07.357 "dma_device_type": 2 00:22:07.357 } 00:22:07.357 ], 00:22:07.357 "driver_specific": { 00:22:07.357 "raid": { 00:22:07.357 "uuid": "855b0bbb-ac27-4f53-98e3-df5f0c442ee7", 00:22:07.357 "strip_size_kb": 0, 00:22:07.357 "state": "online", 00:22:07.357 "raid_level": "raid1", 00:22:07.357 "superblock": true, 00:22:07.357 "num_base_bdevs": 4, 00:22:07.357 "num_base_bdevs_discovered": 4, 00:22:07.357 "num_base_bdevs_operational": 4, 00:22:07.357 "base_bdevs_list": [ 00:22:07.357 { 00:22:07.357 "name": "NewBaseBdev", 00:22:07.357 "uuid": "aefa846a-aac4-4f6d-9166-261bc262a7dd", 00:22:07.357 "is_configured": true, 00:22:07.357 "data_offset": 2048, 00:22:07.357 "data_size": 63488 00:22:07.357 }, 00:22:07.357 { 00:22:07.357 "name": "BaseBdev2", 00:22:07.357 "uuid": "a316fc17-22cc-457c-a03d-ec2fb51084d4", 00:22:07.357 "is_configured": true, 00:22:07.357 "data_offset": 2048, 00:22:07.357 "data_size": 63488 00:22:07.357 }, 00:22:07.357 { 00:22:07.357 "name": "BaseBdev3", 00:22:07.357 "uuid": "214a8b00-b58f-447d-a601-770965bee0f5", 00:22:07.357 "is_configured": true, 00:22:07.357 "data_offset": 2048, 00:22:07.357 "data_size": 63488 00:22:07.357 }, 00:22:07.357 { 00:22:07.357 "name": "BaseBdev4", 00:22:07.357 "uuid": "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32", 00:22:07.357 "is_configured": true, 00:22:07.357 "data_offset": 2048, 00:22:07.357 "data_size": 63488 00:22:07.357 } 00:22:07.357 ] 00:22:07.357 } 00:22:07.357 } 00:22:07.357 }' 00:22:07.357 13:47:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:07.357 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:07.357 BaseBdev2 00:22:07.357 BaseBdev3 00:22:07.357 BaseBdev4' 00:22:07.357 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:07.357 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:07.357 13:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:07.616 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:07.616 "name": "NewBaseBdev", 00:22:07.616 "aliases": [ 00:22:07.616 "aefa846a-aac4-4f6d-9166-261bc262a7dd" 00:22:07.616 ], 00:22:07.616 "product_name": "Malloc disk", 00:22:07.616 "block_size": 512, 00:22:07.616 "num_blocks": 65536, 00:22:07.616 "uuid": "aefa846a-aac4-4f6d-9166-261bc262a7dd", 00:22:07.616 "assigned_rate_limits": { 00:22:07.616 "rw_ios_per_sec": 0, 00:22:07.616 "rw_mbytes_per_sec": 0, 00:22:07.616 "r_mbytes_per_sec": 0, 00:22:07.616 "w_mbytes_per_sec": 0 00:22:07.616 }, 00:22:07.616 "claimed": true, 00:22:07.616 "claim_type": "exclusive_write", 00:22:07.616 "zoned": false, 00:22:07.616 "supported_io_types": { 00:22:07.616 "read": true, 00:22:07.616 "write": true, 00:22:07.617 "unmap": true, 00:22:07.617 "flush": true, 00:22:07.617 "reset": true, 00:22:07.617 "nvme_admin": false, 00:22:07.617 "nvme_io": false, 00:22:07.617 "nvme_io_md": false, 00:22:07.617 "write_zeroes": true, 00:22:07.617 "zcopy": true, 00:22:07.617 "get_zone_info": false, 00:22:07.617 "zone_management": false, 00:22:07.617 "zone_append": false, 00:22:07.617 "compare": false, 00:22:07.617 "compare_and_write": false, 00:22:07.617 "abort": true, 00:22:07.617 "seek_hole": false, 00:22:07.617 "seek_data": false, 00:22:07.617 "copy": true, 00:22:07.617 "nvme_iov_md": false 00:22:07.617 }, 00:22:07.617 "memory_domains": [ 00:22:07.617 { 00:22:07.617 "dma_device_id": "system", 00:22:07.617 "dma_device_type": 1 00:22:07.617 }, 00:22:07.617 { 00:22:07.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:07.617 "dma_device_type": 2 00:22:07.617 } 00:22:07.617 ], 00:22:07.617 "driver_specific": {} 00:22:07.617 }' 00:22:07.617 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:07.617 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:07.876 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:07.876 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:07.876 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:07.876 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:07.876 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:07.876 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:07.876 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:07.876 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:07.876 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:08.135 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:08.135 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:08.135 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:08.135 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:08.394 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:08.394 "name": "BaseBdev2", 00:22:08.394 "aliases": [ 00:22:08.394 "a316fc17-22cc-457c-a03d-ec2fb51084d4" 00:22:08.394 ], 00:22:08.394 "product_name": "Malloc disk", 00:22:08.394 "block_size": 512, 00:22:08.394 "num_blocks": 65536, 00:22:08.394 "uuid": "a316fc17-22cc-457c-a03d-ec2fb51084d4", 00:22:08.394 "assigned_rate_limits": { 00:22:08.394 "rw_ios_per_sec": 0, 00:22:08.394 "rw_mbytes_per_sec": 0, 00:22:08.394 "r_mbytes_per_sec": 0, 00:22:08.394 "w_mbytes_per_sec": 0 00:22:08.394 }, 00:22:08.394 "claimed": true, 00:22:08.394 "claim_type": "exclusive_write", 00:22:08.394 "zoned": false, 00:22:08.394 "supported_io_types": { 00:22:08.394 "read": true, 00:22:08.394 "write": true, 00:22:08.394 "unmap": true, 00:22:08.394 "flush": true, 00:22:08.394 "reset": true, 00:22:08.394 "nvme_admin": false, 00:22:08.394 "nvme_io": false, 00:22:08.394 "nvme_io_md": false, 00:22:08.394 "write_zeroes": true, 00:22:08.394 "zcopy": true, 00:22:08.394 "get_zone_info": false, 00:22:08.394 "zone_management": false, 00:22:08.394 "zone_append": false, 00:22:08.394 "compare": false, 00:22:08.394 "compare_and_write": false, 00:22:08.394 "abort": true, 00:22:08.394 "seek_hole": false, 00:22:08.394 "seek_data": false, 00:22:08.394 "copy": true, 00:22:08.394 "nvme_iov_md": false 00:22:08.394 }, 00:22:08.394 "memory_domains": [ 00:22:08.394 { 00:22:08.394 "dma_device_id": "system", 00:22:08.394 "dma_device_type": 1 00:22:08.394 }, 00:22:08.394 { 00:22:08.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:08.394 "dma_device_type": 2 00:22:08.394 } 00:22:08.394 ], 00:22:08.394 "driver_specific": {} 00:22:08.394 }' 00:22:08.394 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:08.394 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:08.394 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:08.394 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:08.394 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:08.394 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:08.394 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:08.394 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:08.653 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:08.653 13:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:08.653 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:08.653 
13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:08.653 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:08.653 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:08.653 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:08.912 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:08.912 "name": "BaseBdev3", 00:22:08.912 "aliases": [ 00:22:08.912 "214a8b00-b58f-447d-a601-770965bee0f5" 00:22:08.912 ], 00:22:08.912 "product_name": "Malloc disk", 00:22:08.912 "block_size": 512, 00:22:08.912 "num_blocks": 65536, 00:22:08.912 "uuid": "214a8b00-b58f-447d-a601-770965bee0f5", 00:22:08.912 "assigned_rate_limits": { 00:22:08.912 "rw_ios_per_sec": 0, 00:22:08.912 "rw_mbytes_per_sec": 0, 00:22:08.912 "r_mbytes_per_sec": 0, 00:22:08.912 "w_mbytes_per_sec": 0 00:22:08.912 }, 00:22:08.912 "claimed": true, 00:22:08.912 "claim_type": "exclusive_write", 00:22:08.912 "zoned": false, 00:22:08.912 "supported_io_types": { 00:22:08.912 "read": true, 00:22:08.912 "write": true, 00:22:08.912 "unmap": true, 00:22:08.912 "flush": true, 00:22:08.912 "reset": true, 00:22:08.912 "nvme_admin": false, 00:22:08.912 "nvme_io": false, 00:22:08.912 "nvme_io_md": false, 00:22:08.912 "write_zeroes": true, 00:22:08.912 "zcopy": true, 00:22:08.912 "get_zone_info": false, 00:22:08.912 "zone_management": false, 00:22:08.912 "zone_append": false, 00:22:08.912 "compare": false, 00:22:08.912 "compare_and_write": false, 00:22:08.912 "abort": true, 00:22:08.912 "seek_hole": false, 00:22:08.912 "seek_data": false, 00:22:08.912 "copy": true, 00:22:08.912 "nvme_iov_md": false 00:22:08.912 }, 00:22:08.912 "memory_domains": [ 00:22:08.912 { 00:22:08.912 "dma_device_id": "system", 00:22:08.912 "dma_device_type": 1 00:22:08.912 }, 00:22:08.912 { 00:22:08.912 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:08.912 "dma_device_type": 2 00:22:08.912 } 00:22:08.912 ], 00:22:08.912 "driver_specific": {} 00:22:08.912 }' 00:22:08.912 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:08.912 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:08.912 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:08.912 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:08.912 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:09.171 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:09.171 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:09.171 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:09.171 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:09.171 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:09.171 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:09.171 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:09.171 13:47:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:09.171 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:09.171 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:09.430 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:09.430 "name": "BaseBdev4", 00:22:09.430 "aliases": [ 00:22:09.430 "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32" 00:22:09.430 ], 00:22:09.430 "product_name": "Malloc disk", 00:22:09.430 "block_size": 512, 00:22:09.430 "num_blocks": 65536, 00:22:09.430 "uuid": "7fa8fa5a-8176-4e99-93e9-f53f73ab7b32", 00:22:09.430 "assigned_rate_limits": { 00:22:09.430 "rw_ios_per_sec": 0, 00:22:09.430 "rw_mbytes_per_sec": 0, 00:22:09.430 "r_mbytes_per_sec": 0, 00:22:09.430 "w_mbytes_per_sec": 0 00:22:09.430 }, 00:22:09.430 "claimed": true, 00:22:09.430 "claim_type": "exclusive_write", 00:22:09.430 "zoned": false, 00:22:09.430 "supported_io_types": { 00:22:09.430 "read": true, 00:22:09.430 "write": true, 00:22:09.430 "unmap": true, 00:22:09.430 "flush": true, 00:22:09.430 "reset": true, 00:22:09.430 "nvme_admin": false, 00:22:09.430 "nvme_io": false, 00:22:09.430 "nvme_io_md": false, 00:22:09.430 "write_zeroes": true, 00:22:09.430 "zcopy": true, 00:22:09.430 "get_zone_info": false, 00:22:09.430 "zone_management": false, 00:22:09.430 "zone_append": false, 00:22:09.430 "compare": false, 00:22:09.430 "compare_and_write": false, 00:22:09.430 "abort": true, 00:22:09.430 "seek_hole": false, 00:22:09.430 "seek_data": false, 00:22:09.430 "copy": true, 00:22:09.430 "nvme_iov_md": false 00:22:09.430 }, 00:22:09.430 "memory_domains": [ 00:22:09.430 { 00:22:09.430 "dma_device_id": "system", 00:22:09.430 "dma_device_type": 1 00:22:09.431 }, 00:22:09.431 { 00:22:09.431 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:09.431 "dma_device_type": 2 00:22:09.431 } 00:22:09.431 ], 00:22:09.431 "driver_specific": {} 00:22:09.431 }' 00:22:09.431 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:09.431 13:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:09.689 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:09.689 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:09.689 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:09.689 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:09.689 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:09.689 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:09.689 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:09.689 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:09.689 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:09.949 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:09.949 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:09.949 [2024-07-12 13:47:58.508232] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:09.949 [2024-07-12 13:47:58.508258] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:09.949 [2024-07-12 13:47:58.508308] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:09.949 [2024-07-12 13:47:58.508588] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:09.949 [2024-07-12 13:47:58.508600] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ea4d90 name Existed_Raid, state offline 00:22:09.949 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 525947 00:22:09.949 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 525947 ']' 00:22:09.949 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 525947 00:22:09.949 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:22:10.209 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:10.209 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 525947 00:22:10.209 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:10.209 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:10.209 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 525947' 00:22:10.209 killing process with pid 525947 00:22:10.209 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 525947 00:22:10.209 [2024-07-12 13:47:58.574314] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:10.209 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 525947 00:22:10.209 [2024-07-12 13:47:58.610069] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:10.469 13:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:22:10.469 00:22:10.469 real 0m35.571s 00:22:10.469 user 1m5.730s 00:22:10.469 sys 0m6.044s 00:22:10.469 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:10.469 13:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:10.469 ************************************ 00:22:10.469 END TEST raid_state_function_test_sb 00:22:10.469 ************************************ 00:22:10.469 13:47:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:10.469 13:47:58 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:22:10.469 13:47:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:22:10.469 13:47:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:10.469 13:47:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:10.469 ************************************ 00:22:10.469 START TEST raid_superblock_test 00:22:10.469 ************************************ 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 
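For reference, the RPC sequence that raid_superblock_test exercises against the dedicated /var/tmp/spdk-raid.sock socket can be reproduced by hand roughly as follows. This is a condensed sketch assembled from the commands traced below: the bdev_svc launch, the malloc/passthru creation and the raid1 assembly with superblock all appear in the trace, while the sleep is only a stand-in for the test's waitforlisten helper and SPDK stands in for /var/jenkins/workspace/crypto-phy-autotest/spdk.

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Start a bare bdev service with raid debug logging on the dedicated RPC socket.
    $SPDK/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid &
    sleep 1   # stand-in for the test's waitforlisten; assumes the socket comes up within a second

    # Create four 32 MB, 512-byte-block malloc bdevs and wrap each in a passthru bdev,
    # matching the malloc1..malloc4 / pt1..pt4 names and UUIDs seen in the trace.
    for i in 1 2 3 4; do
        $RPC bdev_malloc_create 32 512 -b malloc$i
        $RPC bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
    done

    # Assemble a raid1 volume with an on-disk superblock (-s) from the passthru bdevs.
    $RPC bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s

    # Inspect the result the same way verify_raid_bdev_state does.
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'
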
00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=531178 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 531178 /var/tmp/spdk-raid.sock 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 531178 ']' 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:10.469 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:10.469 13:47:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.469 [2024-07-12 13:47:58.960499] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:22:10.469 [2024-07-12 13:47:58.960564] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid531178 ] 00:22:10.728 [2024-07-12 13:47:59.089736] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:10.728 [2024-07-12 13:47:59.193014] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:10.728 [2024-07-12 13:47:59.259202] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:10.728 [2024-07-12 13:47:59.259241] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:11.667 13:47:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:11.667 13:47:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:22:11.667 13:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:11.667 13:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:11.667 13:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:11.667 13:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:11.667 13:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:11.667 13:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:11.667 13:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:11.667 13:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:11.667 13:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:22:11.667 malloc1 00:22:11.667 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:11.926 [2024-07-12 13:48:00.378457] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:11.926 [2024-07-12 13:48:00.378507] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.926 [2024-07-12 13:48:00.378527] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2086e90 00:22:11.926 [2024-07-12 13:48:00.378541] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.926 [2024-07-12 13:48:00.380142] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.926 [2024-07-12 13:48:00.380169] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:11.926 pt1 00:22:11.926 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:11.926 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:11.926 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:11.926 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:11.926 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:11.926 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:11.926 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:11.926 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:11.926 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:22:12.186 malloc2 00:22:12.186 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:12.446 [2024-07-12 13:48:00.896585] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:12.447 [2024-07-12 13:48:00.896632] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:12.447 [2024-07-12 13:48:00.896650] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2124fb0 00:22:12.447 [2024-07-12 13:48:00.896663] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:12.447 [2024-07-12 13:48:00.898250] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:12.447 [2024-07-12 13:48:00.898277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:12.447 pt2 00:22:12.447 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:12.447 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:12.447 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:22:12.447 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:22:12.447 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:22:12.447 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:12.447 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:12.447 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:12.447 13:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:22:12.725 malloc3 00:22:12.725 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:12.983 [2024-07-12 13:48:01.399713] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:12.983 [2024-07-12 13:48:01.399760] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:12.983 [2024-07-12 13:48:01.399778] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2125ce0 00:22:12.983 [2024-07-12 13:48:01.399791] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:12.983 [2024-07-12 13:48:01.401380] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:12.983 [2024-07-12 13:48:01.401408] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:12.983 pt3 00:22:12.983 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:12.983 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:12.983 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:22:12.983 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:22:12.983 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:22:12.983 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:12.983 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:12.983 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:12.983 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:22:13.241 malloc4 00:22:13.241 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:13.498 [2024-07-12 13:48:01.886540] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:13.498 [2024-07-12 13:48:01.886588] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:13.498 [2024-07-12 13:48:01.886607] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2128450 00:22:13.498 [2024-07-12 13:48:01.886621] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:13.498 [2024-07-12 13:48:01.888151] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:13.498 [2024-07-12 13:48:01.888180] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:13.498 pt4 00:22:13.498 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:13.498 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:13.498 13:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:22:13.756 [2024-07-12 13:48:02.131208] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:13.756 [2024-07-12 13:48:02.132614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:13.756 [2024-07-12 13:48:02.132671] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:13.756 [2024-07-12 13:48:02.132716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:13.756 [2024-07-12 13:48:02.132890] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2089c20 00:22:13.756 [2024-07-12 13:48:02.132906] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:13.756 [2024-07-12 13:48:02.133128] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x212a910 00:22:13.756 [2024-07-12 13:48:02.133286] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2089c20 00:22:13.756 [2024-07-12 13:48:02.133296] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2089c20 00:22:13.756 [2024-07-12 13:48:02.133400] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.756 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.014 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:14.014 "name": "raid_bdev1", 00:22:14.014 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:14.014 "strip_size_kb": 0, 00:22:14.014 "state": "online", 00:22:14.014 "raid_level": "raid1", 00:22:14.014 "superblock": true, 00:22:14.014 "num_base_bdevs": 4, 00:22:14.014 "num_base_bdevs_discovered": 4, 00:22:14.014 "num_base_bdevs_operational": 4, 00:22:14.014 "base_bdevs_list": [ 00:22:14.014 { 00:22:14.014 "name": "pt1", 00:22:14.014 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:14.014 "is_configured": true, 00:22:14.014 "data_offset": 2048, 00:22:14.014 "data_size": 63488 00:22:14.014 }, 00:22:14.014 { 00:22:14.014 "name": "pt2", 00:22:14.014 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:14.014 "is_configured": true, 00:22:14.014 "data_offset": 2048, 00:22:14.014 "data_size": 63488 00:22:14.014 }, 00:22:14.014 { 00:22:14.014 "name": "pt3", 00:22:14.014 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:14.014 "is_configured": true, 00:22:14.014 "data_offset": 2048, 00:22:14.014 "data_size": 63488 00:22:14.014 }, 00:22:14.014 { 00:22:14.014 "name": "pt4", 00:22:14.014 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:14.014 "is_configured": true, 00:22:14.014 "data_offset": 2048, 00:22:14.014 "data_size": 63488 00:22:14.014 } 00:22:14.014 ] 00:22:14.014 }' 00:22:14.014 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:14.014 13:48:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:14.592 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
verify_raid_bdev_properties raid_bdev1 00:22:14.592 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:14.592 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:14.592 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:14.592 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:14.592 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:14.592 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:14.592 13:48:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:14.853 [2024-07-12 13:48:03.214434] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:14.853 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:14.853 "name": "raid_bdev1", 00:22:14.853 "aliases": [ 00:22:14.853 "25dfad31-718e-4688-8e39-43b8820aea95" 00:22:14.853 ], 00:22:14.853 "product_name": "Raid Volume", 00:22:14.853 "block_size": 512, 00:22:14.853 "num_blocks": 63488, 00:22:14.853 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:14.853 "assigned_rate_limits": { 00:22:14.853 "rw_ios_per_sec": 0, 00:22:14.853 "rw_mbytes_per_sec": 0, 00:22:14.853 "r_mbytes_per_sec": 0, 00:22:14.853 "w_mbytes_per_sec": 0 00:22:14.853 }, 00:22:14.853 "claimed": false, 00:22:14.853 "zoned": false, 00:22:14.853 "supported_io_types": { 00:22:14.853 "read": true, 00:22:14.853 "write": true, 00:22:14.853 "unmap": false, 00:22:14.853 "flush": false, 00:22:14.853 "reset": true, 00:22:14.853 "nvme_admin": false, 00:22:14.853 "nvme_io": false, 00:22:14.853 "nvme_io_md": false, 00:22:14.853 "write_zeroes": true, 00:22:14.853 "zcopy": false, 00:22:14.853 "get_zone_info": false, 00:22:14.853 "zone_management": false, 00:22:14.853 "zone_append": false, 00:22:14.853 "compare": false, 00:22:14.853 "compare_and_write": false, 00:22:14.853 "abort": false, 00:22:14.853 "seek_hole": false, 00:22:14.853 "seek_data": false, 00:22:14.853 "copy": false, 00:22:14.853 "nvme_iov_md": false 00:22:14.853 }, 00:22:14.853 "memory_domains": [ 00:22:14.853 { 00:22:14.853 "dma_device_id": "system", 00:22:14.853 "dma_device_type": 1 00:22:14.853 }, 00:22:14.853 { 00:22:14.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.853 "dma_device_type": 2 00:22:14.853 }, 00:22:14.853 { 00:22:14.853 "dma_device_id": "system", 00:22:14.853 "dma_device_type": 1 00:22:14.853 }, 00:22:14.853 { 00:22:14.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.853 "dma_device_type": 2 00:22:14.853 }, 00:22:14.853 { 00:22:14.853 "dma_device_id": "system", 00:22:14.853 "dma_device_type": 1 00:22:14.853 }, 00:22:14.853 { 00:22:14.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.853 "dma_device_type": 2 00:22:14.853 }, 00:22:14.853 { 00:22:14.853 "dma_device_id": "system", 00:22:14.853 "dma_device_type": 1 00:22:14.853 }, 00:22:14.853 { 00:22:14.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.853 "dma_device_type": 2 00:22:14.853 } 00:22:14.853 ], 00:22:14.853 "driver_specific": { 00:22:14.853 "raid": { 00:22:14.853 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:14.853 "strip_size_kb": 0, 00:22:14.853 "state": "online", 00:22:14.853 "raid_level": "raid1", 00:22:14.853 "superblock": true, 00:22:14.853 
"num_base_bdevs": 4, 00:22:14.853 "num_base_bdevs_discovered": 4, 00:22:14.853 "num_base_bdevs_operational": 4, 00:22:14.853 "base_bdevs_list": [ 00:22:14.853 { 00:22:14.853 "name": "pt1", 00:22:14.853 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:14.853 "is_configured": true, 00:22:14.853 "data_offset": 2048, 00:22:14.853 "data_size": 63488 00:22:14.853 }, 00:22:14.853 { 00:22:14.853 "name": "pt2", 00:22:14.853 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:14.853 "is_configured": true, 00:22:14.853 "data_offset": 2048, 00:22:14.853 "data_size": 63488 00:22:14.853 }, 00:22:14.853 { 00:22:14.853 "name": "pt3", 00:22:14.853 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:14.853 "is_configured": true, 00:22:14.853 "data_offset": 2048, 00:22:14.853 "data_size": 63488 00:22:14.853 }, 00:22:14.853 { 00:22:14.853 "name": "pt4", 00:22:14.853 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:14.853 "is_configured": true, 00:22:14.853 "data_offset": 2048, 00:22:14.853 "data_size": 63488 00:22:14.853 } 00:22:14.853 ] 00:22:14.853 } 00:22:14.853 } 00:22:14.853 }' 00:22:14.853 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:14.853 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:14.853 pt2 00:22:14.853 pt3 00:22:14.853 pt4' 00:22:14.853 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:14.853 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:14.853 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:15.111 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:15.111 "name": "pt1", 00:22:15.111 "aliases": [ 00:22:15.111 "00000000-0000-0000-0000-000000000001" 00:22:15.111 ], 00:22:15.111 "product_name": "passthru", 00:22:15.111 "block_size": 512, 00:22:15.111 "num_blocks": 65536, 00:22:15.111 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:15.111 "assigned_rate_limits": { 00:22:15.111 "rw_ios_per_sec": 0, 00:22:15.111 "rw_mbytes_per_sec": 0, 00:22:15.111 "r_mbytes_per_sec": 0, 00:22:15.111 "w_mbytes_per_sec": 0 00:22:15.111 }, 00:22:15.111 "claimed": true, 00:22:15.111 "claim_type": "exclusive_write", 00:22:15.111 "zoned": false, 00:22:15.111 "supported_io_types": { 00:22:15.111 "read": true, 00:22:15.111 "write": true, 00:22:15.111 "unmap": true, 00:22:15.111 "flush": true, 00:22:15.111 "reset": true, 00:22:15.111 "nvme_admin": false, 00:22:15.111 "nvme_io": false, 00:22:15.111 "nvme_io_md": false, 00:22:15.111 "write_zeroes": true, 00:22:15.111 "zcopy": true, 00:22:15.111 "get_zone_info": false, 00:22:15.111 "zone_management": false, 00:22:15.111 "zone_append": false, 00:22:15.111 "compare": false, 00:22:15.111 "compare_and_write": false, 00:22:15.111 "abort": true, 00:22:15.111 "seek_hole": false, 00:22:15.111 "seek_data": false, 00:22:15.111 "copy": true, 00:22:15.111 "nvme_iov_md": false 00:22:15.111 }, 00:22:15.111 "memory_domains": [ 00:22:15.111 { 00:22:15.111 "dma_device_id": "system", 00:22:15.111 "dma_device_type": 1 00:22:15.111 }, 00:22:15.112 { 00:22:15.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:15.112 "dma_device_type": 2 00:22:15.112 } 00:22:15.112 ], 00:22:15.112 "driver_specific": { 00:22:15.112 "passthru": { 00:22:15.112 
"name": "pt1", 00:22:15.112 "base_bdev_name": "malloc1" 00:22:15.112 } 00:22:15.112 } 00:22:15.112 }' 00:22:15.112 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.112 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.112 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:15.112 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:15.112 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:15.370 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:15.370 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:15.370 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:15.370 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:15.370 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:15.370 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:15.370 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:15.370 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:15.370 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:15.370 13:48:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:15.628 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:15.628 "name": "pt2", 00:22:15.628 "aliases": [ 00:22:15.628 "00000000-0000-0000-0000-000000000002" 00:22:15.628 ], 00:22:15.628 "product_name": "passthru", 00:22:15.628 "block_size": 512, 00:22:15.628 "num_blocks": 65536, 00:22:15.628 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:15.628 "assigned_rate_limits": { 00:22:15.628 "rw_ios_per_sec": 0, 00:22:15.628 "rw_mbytes_per_sec": 0, 00:22:15.628 "r_mbytes_per_sec": 0, 00:22:15.628 "w_mbytes_per_sec": 0 00:22:15.628 }, 00:22:15.628 "claimed": true, 00:22:15.628 "claim_type": "exclusive_write", 00:22:15.628 "zoned": false, 00:22:15.628 "supported_io_types": { 00:22:15.628 "read": true, 00:22:15.628 "write": true, 00:22:15.628 "unmap": true, 00:22:15.628 "flush": true, 00:22:15.628 "reset": true, 00:22:15.628 "nvme_admin": false, 00:22:15.628 "nvme_io": false, 00:22:15.629 "nvme_io_md": false, 00:22:15.629 "write_zeroes": true, 00:22:15.629 "zcopy": true, 00:22:15.629 "get_zone_info": false, 00:22:15.629 "zone_management": false, 00:22:15.629 "zone_append": false, 00:22:15.629 "compare": false, 00:22:15.629 "compare_and_write": false, 00:22:15.629 "abort": true, 00:22:15.629 "seek_hole": false, 00:22:15.629 "seek_data": false, 00:22:15.629 "copy": true, 00:22:15.629 "nvme_iov_md": false 00:22:15.629 }, 00:22:15.629 "memory_domains": [ 00:22:15.629 { 00:22:15.629 "dma_device_id": "system", 00:22:15.629 "dma_device_type": 1 00:22:15.629 }, 00:22:15.629 { 00:22:15.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:15.629 "dma_device_type": 2 00:22:15.629 } 00:22:15.629 ], 00:22:15.629 "driver_specific": { 00:22:15.629 "passthru": { 00:22:15.629 "name": "pt2", 00:22:15.629 "base_bdev_name": "malloc2" 00:22:15.629 } 00:22:15.629 } 00:22:15.629 }' 00:22:15.629 13:48:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.629 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.629 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:15.629 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:15.887 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:15.887 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:15.887 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:15.887 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:15.887 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:15.887 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:15.887 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:15.887 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:15.887 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:15.887 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:15.887 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:16.147 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:16.147 "name": "pt3", 00:22:16.147 "aliases": [ 00:22:16.147 "00000000-0000-0000-0000-000000000003" 00:22:16.147 ], 00:22:16.147 "product_name": "passthru", 00:22:16.147 "block_size": 512, 00:22:16.147 "num_blocks": 65536, 00:22:16.147 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:16.147 "assigned_rate_limits": { 00:22:16.147 "rw_ios_per_sec": 0, 00:22:16.147 "rw_mbytes_per_sec": 0, 00:22:16.147 "r_mbytes_per_sec": 0, 00:22:16.147 "w_mbytes_per_sec": 0 00:22:16.147 }, 00:22:16.147 "claimed": true, 00:22:16.147 "claim_type": "exclusive_write", 00:22:16.147 "zoned": false, 00:22:16.147 "supported_io_types": { 00:22:16.147 "read": true, 00:22:16.147 "write": true, 00:22:16.147 "unmap": true, 00:22:16.147 "flush": true, 00:22:16.147 "reset": true, 00:22:16.147 "nvme_admin": false, 00:22:16.147 "nvme_io": false, 00:22:16.147 "nvme_io_md": false, 00:22:16.147 "write_zeroes": true, 00:22:16.147 "zcopy": true, 00:22:16.147 "get_zone_info": false, 00:22:16.147 "zone_management": false, 00:22:16.147 "zone_append": false, 00:22:16.147 "compare": false, 00:22:16.147 "compare_and_write": false, 00:22:16.147 "abort": true, 00:22:16.147 "seek_hole": false, 00:22:16.147 "seek_data": false, 00:22:16.147 "copy": true, 00:22:16.147 "nvme_iov_md": false 00:22:16.147 }, 00:22:16.147 "memory_domains": [ 00:22:16.147 { 00:22:16.147 "dma_device_id": "system", 00:22:16.147 "dma_device_type": 1 00:22:16.147 }, 00:22:16.147 { 00:22:16.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:16.147 "dma_device_type": 2 00:22:16.147 } 00:22:16.147 ], 00:22:16.147 "driver_specific": { 00:22:16.147 "passthru": { 00:22:16.147 "name": "pt3", 00:22:16.147 "base_bdev_name": "malloc3" 00:22:16.147 } 00:22:16.147 } 00:22:16.147 }' 00:22:16.147 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:16.406 13:48:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:16.406 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:16.406 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:16.406 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:16.406 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:16.406 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:16.406 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:16.406 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:16.406 13:48:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:16.665 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:16.665 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:16.665 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:16.665 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:16.665 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:16.925 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:16.925 "name": "pt4", 00:22:16.925 "aliases": [ 00:22:16.925 "00000000-0000-0000-0000-000000000004" 00:22:16.925 ], 00:22:16.925 "product_name": "passthru", 00:22:16.925 "block_size": 512, 00:22:16.925 "num_blocks": 65536, 00:22:16.925 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:16.925 "assigned_rate_limits": { 00:22:16.925 "rw_ios_per_sec": 0, 00:22:16.925 "rw_mbytes_per_sec": 0, 00:22:16.925 "r_mbytes_per_sec": 0, 00:22:16.925 "w_mbytes_per_sec": 0 00:22:16.925 }, 00:22:16.925 "claimed": true, 00:22:16.925 "claim_type": "exclusive_write", 00:22:16.925 "zoned": false, 00:22:16.925 "supported_io_types": { 00:22:16.925 "read": true, 00:22:16.925 "write": true, 00:22:16.925 "unmap": true, 00:22:16.925 "flush": true, 00:22:16.925 "reset": true, 00:22:16.925 "nvme_admin": false, 00:22:16.925 "nvme_io": false, 00:22:16.925 "nvme_io_md": false, 00:22:16.925 "write_zeroes": true, 00:22:16.925 "zcopy": true, 00:22:16.925 "get_zone_info": false, 00:22:16.925 "zone_management": false, 00:22:16.925 "zone_append": false, 00:22:16.925 "compare": false, 00:22:16.925 "compare_and_write": false, 00:22:16.925 "abort": true, 00:22:16.925 "seek_hole": false, 00:22:16.925 "seek_data": false, 00:22:16.925 "copy": true, 00:22:16.925 "nvme_iov_md": false 00:22:16.925 }, 00:22:16.925 "memory_domains": [ 00:22:16.925 { 00:22:16.925 "dma_device_id": "system", 00:22:16.925 "dma_device_type": 1 00:22:16.925 }, 00:22:16.925 { 00:22:16.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:16.925 "dma_device_type": 2 00:22:16.925 } 00:22:16.925 ], 00:22:16.925 "driver_specific": { 00:22:16.925 "passthru": { 00:22:16.925 "name": "pt4", 00:22:16.925 "base_bdev_name": "malloc4" 00:22:16.925 } 00:22:16.925 } 00:22:16.925 }' 00:22:16.925 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:16.925 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:16.925 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 
512 ]] 00:22:16.925 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:16.925 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:16.925 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:16.925 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:16.925 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:17.184 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:17.185 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:17.185 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:17.185 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:17.185 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:17.185 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:17.443 [2024-07-12 13:48:05.837422] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:17.443 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=25dfad31-718e-4688-8e39-43b8820aea95 00:22:17.443 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 25dfad31-718e-4688-8e39-43b8820aea95 ']' 00:22:17.443 13:48:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:17.702 [2024-07-12 13:48:06.089789] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:17.702 [2024-07-12 13:48:06.089818] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:17.702 [2024-07-12 13:48:06.089871] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:17.702 [2024-07-12 13:48:06.089964] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:17.702 [2024-07-12 13:48:06.089978] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2089c20 name raid_bdev1, state offline 00:22:17.702 13:48:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.702 13:48:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:17.961 13:48:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:17.961 13:48:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:17.961 13:48:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:17.961 13:48:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:18.219 13:48:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:18.219 13:48:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:22:18.495 13:48:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:18.495 13:48:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:18.772 13:48:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:18.772 13:48:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:19.085 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:19.379 [2024-07-12 13:48:07.834327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:19.379 [2024-07-12 13:48:07.835707] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:19.379 [2024-07-12 13:48:07.835751] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:22:19.379 [2024-07-12 13:48:07.835786] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:22:19.379 [2024-07-12 13:48:07.835830] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:19.379 [2024-07-12 13:48:07.835870] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:19.379 [2024-07-12 13:48:07.835893] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:22:19.379 [2024-07-12 13:48:07.835915] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:22:19.379 [2024-07-12 13:48:07.835944] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:19.379 [2024-07-12 13:48:07.835959] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2129a40 name raid_bdev1, state configuring 00:22:19.379 request: 00:22:19.379 { 00:22:19.379 "name": "raid_bdev1", 00:22:19.379 "raid_level": "raid1", 00:22:19.379 "base_bdevs": [ 00:22:19.379 "malloc1", 00:22:19.379 "malloc2", 00:22:19.379 "malloc3", 00:22:19.379 "malloc4" 00:22:19.379 ], 00:22:19.379 "superblock": false, 00:22:19.379 "method": "bdev_raid_create", 00:22:19.379 "req_id": 1 00:22:19.379 } 00:22:19.379 Got JSON-RPC error response 00:22:19.379 response: 00:22:19.379 { 00:22:19.379 "code": -17, 00:22:19.379 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:19.379 } 00:22:19.379 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:22:19.379 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:19.379 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:19.379 13:48:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:19.379 13:48:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.379 13:48:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:19.638 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:19.638 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:19.638 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:19.897 [2024-07-12 13:48:08.327577] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:19.897 [2024-07-12 13:48:08.327627] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:19.897 [2024-07-12 13:48:08.327645] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21266f0 00:22:19.897 [2024-07-12 13:48:08.327659] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:19.897 [2024-07-12 13:48:08.329261] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:19.897 [2024-07-12 13:48:08.329287] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:19.897 [2024-07-12 13:48:08.329358] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:22:19.897 [2024-07-12 13:48:08.329384] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:19.897 pt1 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.897 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.156 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.156 "name": "raid_bdev1", 00:22:20.156 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:20.156 "strip_size_kb": 0, 00:22:20.156 "state": "configuring", 00:22:20.156 "raid_level": "raid1", 00:22:20.156 "superblock": true, 00:22:20.156 "num_base_bdevs": 4, 00:22:20.156 "num_base_bdevs_discovered": 1, 00:22:20.156 "num_base_bdevs_operational": 4, 00:22:20.156 "base_bdevs_list": [ 00:22:20.156 { 00:22:20.156 "name": "pt1", 00:22:20.156 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:20.156 "is_configured": true, 00:22:20.156 "data_offset": 2048, 00:22:20.156 "data_size": 63488 00:22:20.156 }, 00:22:20.156 { 00:22:20.156 "name": null, 00:22:20.156 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:20.156 "is_configured": false, 00:22:20.156 "data_offset": 2048, 00:22:20.156 "data_size": 63488 00:22:20.156 }, 00:22:20.156 { 00:22:20.156 "name": null, 00:22:20.156 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:20.156 "is_configured": false, 00:22:20.156 "data_offset": 2048, 00:22:20.156 "data_size": 63488 00:22:20.156 }, 00:22:20.156 { 00:22:20.156 "name": null, 00:22:20.156 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:20.156 "is_configured": false, 00:22:20.156 "data_offset": 2048, 00:22:20.156 "data_size": 63488 00:22:20.156 } 00:22:20.156 ] 00:22:20.156 }' 00:22:20.156 13:48:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.156 13:48:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:20.723 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:22:20.723 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:22:20.982 [2024-07-12 13:48:09.418477] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:20.982 [2024-07-12 13:48:09.418528] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:20.982 [2024-07-12 13:48:09.418547] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21295d0 00:22:20.982 [2024-07-12 13:48:09.418560] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:20.982 [2024-07-12 13:48:09.418894] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:20.982 [2024-07-12 13:48:09.418911] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:20.982 [2024-07-12 13:48:09.418982] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:20.982 [2024-07-12 13:48:09.419002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:20.982 pt2 00:22:20.982 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:21.240 [2024-07-12 13:48:09.667142] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.240 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.499 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.499 "name": "raid_bdev1", 00:22:21.499 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:21.499 "strip_size_kb": 0, 00:22:21.499 "state": "configuring", 00:22:21.499 "raid_level": "raid1", 00:22:21.499 "superblock": true, 00:22:21.499 "num_base_bdevs": 4, 00:22:21.499 "num_base_bdevs_discovered": 1, 00:22:21.499 "num_base_bdevs_operational": 4, 00:22:21.499 "base_bdevs_list": [ 00:22:21.499 { 00:22:21.499 "name": "pt1", 00:22:21.499 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:21.499 "is_configured": true, 00:22:21.499 "data_offset": 2048, 00:22:21.499 "data_size": 63488 00:22:21.499 }, 00:22:21.499 { 00:22:21.499 "name": null, 00:22:21.499 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:22:21.499 "is_configured": false, 00:22:21.499 "data_offset": 2048, 00:22:21.499 "data_size": 63488 00:22:21.499 }, 00:22:21.499 { 00:22:21.499 "name": null, 00:22:21.499 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:21.499 "is_configured": false, 00:22:21.499 "data_offset": 2048, 00:22:21.499 "data_size": 63488 00:22:21.499 }, 00:22:21.499 { 00:22:21.499 "name": null, 00:22:21.499 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:21.499 "is_configured": false, 00:22:21.499 "data_offset": 2048, 00:22:21.499 "data_size": 63488 00:22:21.499 } 00:22:21.499 ] 00:22:21.499 }' 00:22:21.499 13:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.499 13:48:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:22.065 13:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:22.065 13:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:22.065 13:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:22.324 [2024-07-12 13:48:10.830232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:22.324 [2024-07-12 13:48:10.830285] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:22.324 [2024-07-12 13:48:10.830305] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2125430 00:22:22.324 [2024-07-12 13:48:10.830318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:22.324 [2024-07-12 13:48:10.830650] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:22.324 [2024-07-12 13:48:10.830667] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:22.324 [2024-07-12 13:48:10.830732] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:22.324 [2024-07-12 13:48:10.830750] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:22.324 pt2 00:22:22.324 13:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:22.324 13:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:22.324 13:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:22.583 [2024-07-12 13:48:11.074882] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:22.583 [2024-07-12 13:48:11.074924] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:22.583 [2024-07-12 13:48:11.074955] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2129140 00:22:22.583 [2024-07-12 13:48:11.074968] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:22.583 [2024-07-12 13:48:11.075302] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:22.583 [2024-07-12 13:48:11.075320] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:22.583 [2024-07-12 13:48:11.075381] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on 
bdev pt3 00:22:22.583 [2024-07-12 13:48:11.075400] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:22.583 pt3 00:22:22.583 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:22.583 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:22.583 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:22.842 [2024-07-12 13:48:11.315516] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:22.842 [2024-07-12 13:48:11.315562] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:22.842 [2024-07-12 13:48:11.315579] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2089660 00:22:22.842 [2024-07-12 13:48:11.315591] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:22.842 [2024-07-12 13:48:11.315912] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:22.842 [2024-07-12 13:48:11.315942] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:22.842 [2024-07-12 13:48:11.316030] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:22.842 [2024-07-12 13:48:11.316056] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:22.842 [2024-07-12 13:48:11.316182] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x212a4b0 00:22:22.842 [2024-07-12 13:48:11.316192] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:22.842 [2024-07-12 13:48:11.316359] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x212de20 00:22:22.842 [2024-07-12 13:48:11.316492] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x212a4b0 00:22:22.842 [2024-07-12 13:48:11.316501] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x212a4b0 00:22:22.842 [2024-07-12 13:48:11.316598] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:22.842 pt4 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.842 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.100 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.100 "name": "raid_bdev1", 00:22:23.100 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:23.100 "strip_size_kb": 0, 00:22:23.100 "state": "online", 00:22:23.100 "raid_level": "raid1", 00:22:23.100 "superblock": true, 00:22:23.100 "num_base_bdevs": 4, 00:22:23.100 "num_base_bdevs_discovered": 4, 00:22:23.100 "num_base_bdevs_operational": 4, 00:22:23.100 "base_bdevs_list": [ 00:22:23.100 { 00:22:23.100 "name": "pt1", 00:22:23.100 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:23.100 "is_configured": true, 00:22:23.100 "data_offset": 2048, 00:22:23.100 "data_size": 63488 00:22:23.100 }, 00:22:23.100 { 00:22:23.100 "name": "pt2", 00:22:23.100 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:23.100 "is_configured": true, 00:22:23.100 "data_offset": 2048, 00:22:23.100 "data_size": 63488 00:22:23.100 }, 00:22:23.100 { 00:22:23.100 "name": "pt3", 00:22:23.100 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:23.100 "is_configured": true, 00:22:23.100 "data_offset": 2048, 00:22:23.100 "data_size": 63488 00:22:23.100 }, 00:22:23.100 { 00:22:23.100 "name": "pt4", 00:22:23.100 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:23.100 "is_configured": true, 00:22:23.100 "data_offset": 2048, 00:22:23.100 "data_size": 63488 00:22:23.100 } 00:22:23.100 ] 00:22:23.100 }' 00:22:23.100 13:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.100 13:48:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:23.666 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:23.666 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:23.666 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:23.666 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:23.666 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:23.666 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:23.666 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:23.666 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:23.924 [2024-07-12 13:48:12.406735] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:23.924 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:23.924 "name": "raid_bdev1", 00:22:23.924 "aliases": [ 00:22:23.924 "25dfad31-718e-4688-8e39-43b8820aea95" 00:22:23.924 ], 00:22:23.924 "product_name": "Raid Volume", 00:22:23.924 "block_size": 512, 00:22:23.924 "num_blocks": 63488, 00:22:23.925 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:23.925 "assigned_rate_limits": { 00:22:23.925 "rw_ios_per_sec": 0, 
00:22:23.925 "rw_mbytes_per_sec": 0, 00:22:23.925 "r_mbytes_per_sec": 0, 00:22:23.925 "w_mbytes_per_sec": 0 00:22:23.925 }, 00:22:23.925 "claimed": false, 00:22:23.925 "zoned": false, 00:22:23.925 "supported_io_types": { 00:22:23.925 "read": true, 00:22:23.925 "write": true, 00:22:23.925 "unmap": false, 00:22:23.925 "flush": false, 00:22:23.925 "reset": true, 00:22:23.925 "nvme_admin": false, 00:22:23.925 "nvme_io": false, 00:22:23.925 "nvme_io_md": false, 00:22:23.925 "write_zeroes": true, 00:22:23.925 "zcopy": false, 00:22:23.925 "get_zone_info": false, 00:22:23.925 "zone_management": false, 00:22:23.925 "zone_append": false, 00:22:23.925 "compare": false, 00:22:23.925 "compare_and_write": false, 00:22:23.925 "abort": false, 00:22:23.925 "seek_hole": false, 00:22:23.925 "seek_data": false, 00:22:23.925 "copy": false, 00:22:23.925 "nvme_iov_md": false 00:22:23.925 }, 00:22:23.925 "memory_domains": [ 00:22:23.925 { 00:22:23.925 "dma_device_id": "system", 00:22:23.925 "dma_device_type": 1 00:22:23.925 }, 00:22:23.925 { 00:22:23.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.925 "dma_device_type": 2 00:22:23.925 }, 00:22:23.925 { 00:22:23.925 "dma_device_id": "system", 00:22:23.925 "dma_device_type": 1 00:22:23.925 }, 00:22:23.925 { 00:22:23.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.925 "dma_device_type": 2 00:22:23.925 }, 00:22:23.925 { 00:22:23.925 "dma_device_id": "system", 00:22:23.925 "dma_device_type": 1 00:22:23.925 }, 00:22:23.925 { 00:22:23.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.925 "dma_device_type": 2 00:22:23.925 }, 00:22:23.925 { 00:22:23.925 "dma_device_id": "system", 00:22:23.925 "dma_device_type": 1 00:22:23.925 }, 00:22:23.925 { 00:22:23.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.925 "dma_device_type": 2 00:22:23.925 } 00:22:23.925 ], 00:22:23.925 "driver_specific": { 00:22:23.925 "raid": { 00:22:23.925 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:23.925 "strip_size_kb": 0, 00:22:23.925 "state": "online", 00:22:23.925 "raid_level": "raid1", 00:22:23.925 "superblock": true, 00:22:23.925 "num_base_bdevs": 4, 00:22:23.925 "num_base_bdevs_discovered": 4, 00:22:23.925 "num_base_bdevs_operational": 4, 00:22:23.925 "base_bdevs_list": [ 00:22:23.925 { 00:22:23.925 "name": "pt1", 00:22:23.925 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:23.925 "is_configured": true, 00:22:23.925 "data_offset": 2048, 00:22:23.925 "data_size": 63488 00:22:23.925 }, 00:22:23.925 { 00:22:23.925 "name": "pt2", 00:22:23.925 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:23.925 "is_configured": true, 00:22:23.925 "data_offset": 2048, 00:22:23.925 "data_size": 63488 00:22:23.925 }, 00:22:23.925 { 00:22:23.925 "name": "pt3", 00:22:23.925 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:23.925 "is_configured": true, 00:22:23.925 "data_offset": 2048, 00:22:23.925 "data_size": 63488 00:22:23.925 }, 00:22:23.925 { 00:22:23.925 "name": "pt4", 00:22:23.925 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:23.925 "is_configured": true, 00:22:23.925 "data_offset": 2048, 00:22:23.925 "data_size": 63488 00:22:23.925 } 00:22:23.925 ] 00:22:23.925 } 00:22:23.925 } 00:22:23.925 }' 00:22:23.925 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:23.925 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:23.925 pt2 00:22:23.925 pt3 00:22:23.925 pt4' 00:22:23.925 13:48:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:23.925 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:23.925 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:24.491 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:24.491 "name": "pt1", 00:22:24.491 "aliases": [ 00:22:24.491 "00000000-0000-0000-0000-000000000001" 00:22:24.491 ], 00:22:24.491 "product_name": "passthru", 00:22:24.491 "block_size": 512, 00:22:24.491 "num_blocks": 65536, 00:22:24.491 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:24.491 "assigned_rate_limits": { 00:22:24.491 "rw_ios_per_sec": 0, 00:22:24.491 "rw_mbytes_per_sec": 0, 00:22:24.491 "r_mbytes_per_sec": 0, 00:22:24.491 "w_mbytes_per_sec": 0 00:22:24.491 }, 00:22:24.491 "claimed": true, 00:22:24.491 "claim_type": "exclusive_write", 00:22:24.491 "zoned": false, 00:22:24.491 "supported_io_types": { 00:22:24.491 "read": true, 00:22:24.491 "write": true, 00:22:24.491 "unmap": true, 00:22:24.491 "flush": true, 00:22:24.491 "reset": true, 00:22:24.491 "nvme_admin": false, 00:22:24.491 "nvme_io": false, 00:22:24.491 "nvme_io_md": false, 00:22:24.491 "write_zeroes": true, 00:22:24.491 "zcopy": true, 00:22:24.491 "get_zone_info": false, 00:22:24.491 "zone_management": false, 00:22:24.491 "zone_append": false, 00:22:24.491 "compare": false, 00:22:24.491 "compare_and_write": false, 00:22:24.491 "abort": true, 00:22:24.491 "seek_hole": false, 00:22:24.491 "seek_data": false, 00:22:24.491 "copy": true, 00:22:24.491 "nvme_iov_md": false 00:22:24.491 }, 00:22:24.491 "memory_domains": [ 00:22:24.491 { 00:22:24.491 "dma_device_id": "system", 00:22:24.491 "dma_device_type": 1 00:22:24.491 }, 00:22:24.491 { 00:22:24.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.491 "dma_device_type": 2 00:22:24.491 } 00:22:24.491 ], 00:22:24.491 "driver_specific": { 00:22:24.491 "passthru": { 00:22:24.491 "name": "pt1", 00:22:24.491 "base_bdev_name": "malloc1" 00:22:24.491 } 00:22:24.491 } 00:22:24.491 }' 00:22:24.491 13:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.491 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.750 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:24.750 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.750 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.750 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:24.750 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.750 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.750 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:24.750 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.008 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.008 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:25.008 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:25.008 13:48:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:25.008 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:25.575 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:25.575 "name": "pt2", 00:22:25.575 "aliases": [ 00:22:25.575 "00000000-0000-0000-0000-000000000002" 00:22:25.575 ], 00:22:25.575 "product_name": "passthru", 00:22:25.575 "block_size": 512, 00:22:25.575 "num_blocks": 65536, 00:22:25.575 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:25.575 "assigned_rate_limits": { 00:22:25.575 "rw_ios_per_sec": 0, 00:22:25.575 "rw_mbytes_per_sec": 0, 00:22:25.575 "r_mbytes_per_sec": 0, 00:22:25.575 "w_mbytes_per_sec": 0 00:22:25.575 }, 00:22:25.575 "claimed": true, 00:22:25.575 "claim_type": "exclusive_write", 00:22:25.575 "zoned": false, 00:22:25.575 "supported_io_types": { 00:22:25.575 "read": true, 00:22:25.575 "write": true, 00:22:25.575 "unmap": true, 00:22:25.575 "flush": true, 00:22:25.575 "reset": true, 00:22:25.575 "nvme_admin": false, 00:22:25.575 "nvme_io": false, 00:22:25.575 "nvme_io_md": false, 00:22:25.575 "write_zeroes": true, 00:22:25.575 "zcopy": true, 00:22:25.575 "get_zone_info": false, 00:22:25.575 "zone_management": false, 00:22:25.575 "zone_append": false, 00:22:25.575 "compare": false, 00:22:25.575 "compare_and_write": false, 00:22:25.575 "abort": true, 00:22:25.575 "seek_hole": false, 00:22:25.575 "seek_data": false, 00:22:25.575 "copy": true, 00:22:25.575 "nvme_iov_md": false 00:22:25.575 }, 00:22:25.575 "memory_domains": [ 00:22:25.575 { 00:22:25.575 "dma_device_id": "system", 00:22:25.575 "dma_device_type": 1 00:22:25.575 }, 00:22:25.575 { 00:22:25.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.575 "dma_device_type": 2 00:22:25.575 } 00:22:25.575 ], 00:22:25.575 "driver_specific": { 00:22:25.575 "passthru": { 00:22:25.575 "name": "pt2", 00:22:25.575 "base_bdev_name": "malloc2" 00:22:25.575 } 00:22:25.575 } 00:22:25.575 }' 00:22:25.575 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.575 13:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.575 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:25.575 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.575 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.575 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:25.575 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.834 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.834 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:25.834 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.834 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.834 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:25.834 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:25.834 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:25.834 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:26.401 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:26.401 "name": "pt3", 00:22:26.401 "aliases": [ 00:22:26.401 "00000000-0000-0000-0000-000000000003" 00:22:26.401 ], 00:22:26.401 "product_name": "passthru", 00:22:26.401 "block_size": 512, 00:22:26.401 "num_blocks": 65536, 00:22:26.401 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:26.401 "assigned_rate_limits": { 00:22:26.401 "rw_ios_per_sec": 0, 00:22:26.401 "rw_mbytes_per_sec": 0, 00:22:26.402 "r_mbytes_per_sec": 0, 00:22:26.402 "w_mbytes_per_sec": 0 00:22:26.402 }, 00:22:26.402 "claimed": true, 00:22:26.402 "claim_type": "exclusive_write", 00:22:26.402 "zoned": false, 00:22:26.402 "supported_io_types": { 00:22:26.402 "read": true, 00:22:26.402 "write": true, 00:22:26.402 "unmap": true, 00:22:26.402 "flush": true, 00:22:26.402 "reset": true, 00:22:26.402 "nvme_admin": false, 00:22:26.402 "nvme_io": false, 00:22:26.402 "nvme_io_md": false, 00:22:26.402 "write_zeroes": true, 00:22:26.402 "zcopy": true, 00:22:26.402 "get_zone_info": false, 00:22:26.402 "zone_management": false, 00:22:26.402 "zone_append": false, 00:22:26.402 "compare": false, 00:22:26.402 "compare_and_write": false, 00:22:26.402 "abort": true, 00:22:26.402 "seek_hole": false, 00:22:26.402 "seek_data": false, 00:22:26.402 "copy": true, 00:22:26.402 "nvme_iov_md": false 00:22:26.402 }, 00:22:26.402 "memory_domains": [ 00:22:26.402 { 00:22:26.402 "dma_device_id": "system", 00:22:26.402 "dma_device_type": 1 00:22:26.402 }, 00:22:26.402 { 00:22:26.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.402 "dma_device_type": 2 00:22:26.402 } 00:22:26.402 ], 00:22:26.402 "driver_specific": { 00:22:26.402 "passthru": { 00:22:26.402 "name": "pt3", 00:22:26.402 "base_bdev_name": "malloc3" 00:22:26.402 } 00:22:26.402 } 00:22:26.402 }' 00:22:26.402 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.402 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.660 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:26.660 13:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:26.660 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:26.660 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:26.660 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.660 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.660 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:26.660 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.919 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.919 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:26.919 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:26.919 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:26.919 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq 
'.[]' 00:22:27.486 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:27.486 "name": "pt4", 00:22:27.486 "aliases": [ 00:22:27.486 "00000000-0000-0000-0000-000000000004" 00:22:27.486 ], 00:22:27.486 "product_name": "passthru", 00:22:27.486 "block_size": 512, 00:22:27.486 "num_blocks": 65536, 00:22:27.486 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:27.486 "assigned_rate_limits": { 00:22:27.486 "rw_ios_per_sec": 0, 00:22:27.486 "rw_mbytes_per_sec": 0, 00:22:27.486 "r_mbytes_per_sec": 0, 00:22:27.486 "w_mbytes_per_sec": 0 00:22:27.486 }, 00:22:27.486 "claimed": true, 00:22:27.486 "claim_type": "exclusive_write", 00:22:27.486 "zoned": false, 00:22:27.486 "supported_io_types": { 00:22:27.486 "read": true, 00:22:27.486 "write": true, 00:22:27.486 "unmap": true, 00:22:27.486 "flush": true, 00:22:27.486 "reset": true, 00:22:27.486 "nvme_admin": false, 00:22:27.486 "nvme_io": false, 00:22:27.486 "nvme_io_md": false, 00:22:27.486 "write_zeroes": true, 00:22:27.486 "zcopy": true, 00:22:27.486 "get_zone_info": false, 00:22:27.486 "zone_management": false, 00:22:27.486 "zone_append": false, 00:22:27.486 "compare": false, 00:22:27.486 "compare_and_write": false, 00:22:27.486 "abort": true, 00:22:27.486 "seek_hole": false, 00:22:27.486 "seek_data": false, 00:22:27.486 "copy": true, 00:22:27.486 "nvme_iov_md": false 00:22:27.486 }, 00:22:27.486 "memory_domains": [ 00:22:27.486 { 00:22:27.486 "dma_device_id": "system", 00:22:27.486 "dma_device_type": 1 00:22:27.486 }, 00:22:27.486 { 00:22:27.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:27.486 "dma_device_type": 2 00:22:27.486 } 00:22:27.486 ], 00:22:27.486 "driver_specific": { 00:22:27.486 "passthru": { 00:22:27.486 "name": "pt4", 00:22:27.486 "base_bdev_name": "malloc4" 00:22:27.486 } 00:22:27.486 } 00:22:27.486 }' 00:22:27.486 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.486 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.486 13:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:27.487 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.745 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.745 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:27.745 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.745 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.745 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:27.745 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.745 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:28.004 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:28.004 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:28.004 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:28.573 [2024-07-12 13:48:16.862710] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:28.573 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 
25dfad31-718e-4688-8e39-43b8820aea95 '!=' 25dfad31-718e-4688-8e39-43b8820aea95 ']' 00:22:28.573 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:28.573 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:28.573 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:28.573 13:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:28.832 [2024-07-12 13:48:17.379798] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:28.832 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:28.832 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:28.832 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:28.832 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:28.832 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:28.832 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:28.832 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:28.832 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:28.832 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:28.832 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:29.091 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.091 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.091 13:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:29.091 "name": "raid_bdev1", 00:22:29.091 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:29.091 "strip_size_kb": 0, 00:22:29.091 "state": "online", 00:22:29.091 "raid_level": "raid1", 00:22:29.091 "superblock": true, 00:22:29.091 "num_base_bdevs": 4, 00:22:29.091 "num_base_bdevs_discovered": 3, 00:22:29.091 "num_base_bdevs_operational": 3, 00:22:29.091 "base_bdevs_list": [ 00:22:29.091 { 00:22:29.091 "name": null, 00:22:29.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:29.091 "is_configured": false, 00:22:29.091 "data_offset": 2048, 00:22:29.091 "data_size": 63488 00:22:29.091 }, 00:22:29.091 { 00:22:29.091 "name": "pt2", 00:22:29.091 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:29.091 "is_configured": true, 00:22:29.091 "data_offset": 2048, 00:22:29.091 "data_size": 63488 00:22:29.091 }, 00:22:29.091 { 00:22:29.091 "name": "pt3", 00:22:29.091 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:29.091 "is_configured": true, 00:22:29.091 "data_offset": 2048, 00:22:29.091 "data_size": 63488 00:22:29.091 }, 00:22:29.091 { 00:22:29.091 "name": "pt4", 00:22:29.091 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:29.091 "is_configured": true, 00:22:29.091 "data_offset": 2048, 00:22:29.091 "data_size": 63488 00:22:29.091 } 00:22:29.091 ] 00:22:29.091 }' 00:22:29.091 13:48:17 
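
The @490-@495 sequence just logged drops one passthru member and re-checks the array; a minimal sketch of that RPC flow against the same socket (the RPC names are copied from the log, the rest is illustrative):

    # raid1 carries redundancy, so deleting one base bdev should leave
    # raid_bdev1 online with only 3 of its 4 base bdevs discovered.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    "$rpc" -s "$sock" bdev_passthru_delete pt1
    raid_bdev_info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
                     jq -r '.[] | select(.name == "raid_bdev1")')
    [[ $(jq -r .state                     <<< "$raid_bdev_info") == online ]]
    [[ $(jq -r .num_base_bdevs_discovered <<< "$raid_bdev_info") == 3 ]]
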
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:29.091 13:48:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:30.030 13:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:30.030 [2024-07-12 13:48:18.506746] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:30.030 [2024-07-12 13:48:18.506773] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:30.030 [2024-07-12 13:48:18.506826] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:30.030 [2024-07-12 13:48:18.506890] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:30.030 [2024-07-12 13:48:18.506902] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x212a4b0 name raid_bdev1, state offline 00:22:30.030 13:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.030 13:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:30.290 13:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:30.290 13:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:30.290 13:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:30.290 13:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:30.290 13:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:30.549 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:30.549 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:30.549 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:30.808 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:30.808 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:30.808 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:31.066 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:31.066 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:31.066 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:31.066 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:31.066 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:31.325 [2024-07-12 13:48:19.753985] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:31.325 [2024-07-12 13:48:19.754027] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:31.325 [2024-07-12 13:48:19.754045] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21253e0 00:22:31.325 [2024-07-12 13:48:19.754058] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:31.325 [2024-07-12 13:48:19.755641] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:31.325 [2024-07-12 13:48:19.755669] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:31.325 [2024-07-12 13:48:19.755731] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:31.325 [2024-07-12 13:48:19.755756] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:31.325 pt2 00:22:31.325 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:31.325 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:31.325 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:31.325 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:31.325 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:31.325 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:31.326 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:31.326 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:31.326 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:31.326 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:31.326 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.326 13:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:31.585 13:48:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:31.585 "name": "raid_bdev1", 00:22:31.585 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:31.585 "strip_size_kb": 0, 00:22:31.585 "state": "configuring", 00:22:31.585 "raid_level": "raid1", 00:22:31.585 "superblock": true, 00:22:31.585 "num_base_bdevs": 4, 00:22:31.585 "num_base_bdevs_discovered": 1, 00:22:31.585 "num_base_bdevs_operational": 3, 00:22:31.585 "base_bdevs_list": [ 00:22:31.585 { 00:22:31.585 "name": null, 00:22:31.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:31.585 "is_configured": false, 00:22:31.585 "data_offset": 2048, 00:22:31.585 "data_size": 63488 00:22:31.585 }, 00:22:31.585 { 00:22:31.585 "name": "pt2", 00:22:31.585 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:31.585 "is_configured": true, 00:22:31.585 "data_offset": 2048, 00:22:31.585 "data_size": 63488 00:22:31.585 }, 00:22:31.585 { 00:22:31.585 "name": null, 00:22:31.585 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:31.585 "is_configured": false, 00:22:31.585 "data_offset": 2048, 00:22:31.585 "data_size": 63488 00:22:31.585 }, 00:22:31.585 { 00:22:31.585 "name": null, 00:22:31.585 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:31.585 "is_configured": 
false, 00:22:31.585 "data_offset": 2048, 00:22:31.585 "data_size": 63488 00:22:31.585 } 00:22:31.585 ] 00:22:31.585 }' 00:22:31.585 13:48:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:31.585 13:48:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:32.521 13:48:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:32.521 13:48:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:32.522 13:48:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:32.522 [2024-07-12 13:48:21.093540] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:32.522 [2024-07-12 13:48:21.093592] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.522 [2024-07-12 13:48:21.093610] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2129800 00:22:32.522 [2024-07-12 13:48:21.093623] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.522 [2024-07-12 13:48:21.093965] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.522 [2024-07-12 13:48:21.093983] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:32.522 [2024-07-12 13:48:21.094044] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:32.522 [2024-07-12 13:48:21.094062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:32.522 pt3 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.781 "name": "raid_bdev1", 00:22:32.781 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:32.781 "strip_size_kb": 0, 00:22:32.781 "state": "configuring", 00:22:32.781 "raid_level": "raid1", 00:22:32.781 "superblock": true, 00:22:32.781 "num_base_bdevs": 
4, 00:22:32.781 "num_base_bdevs_discovered": 2, 00:22:32.781 "num_base_bdevs_operational": 3, 00:22:32.781 "base_bdevs_list": [ 00:22:32.781 { 00:22:32.781 "name": null, 00:22:32.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.781 "is_configured": false, 00:22:32.781 "data_offset": 2048, 00:22:32.781 "data_size": 63488 00:22:32.781 }, 00:22:32.781 { 00:22:32.781 "name": "pt2", 00:22:32.781 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:32.781 "is_configured": true, 00:22:32.781 "data_offset": 2048, 00:22:32.781 "data_size": 63488 00:22:32.781 }, 00:22:32.781 { 00:22:32.781 "name": "pt3", 00:22:32.781 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:32.781 "is_configured": true, 00:22:32.781 "data_offset": 2048, 00:22:32.781 "data_size": 63488 00:22:32.781 }, 00:22:32.781 { 00:22:32.781 "name": null, 00:22:32.781 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:32.781 "is_configured": false, 00:22:32.781 "data_offset": 2048, 00:22:32.781 "data_size": 63488 00:22:32.781 } 00:22:32.781 ] 00:22:32.781 }' 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.781 13:48:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:33.350 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:33.350 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:33.350 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:22:33.350 13:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:33.609 [2024-07-12 13:48:22.104228] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:33.609 [2024-07-12 13:48:22.104278] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:33.609 [2024-07-12 13:48:22.104300] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2128fa0 00:22:33.609 [2024-07-12 13:48:22.104313] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:33.609 [2024-07-12 13:48:22.104648] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:33.609 [2024-07-12 13:48:22.104664] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:33.609 [2024-07-12 13:48:22.104722] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:33.609 [2024-07-12 13:48:22.104741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:33.609 [2024-07-12 13:48:22.104852] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x207d880 00:22:33.609 [2024-07-12 13:48:22.104863] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:33.609 [2024-07-12 13:48:22.105039] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x212fca0 00:22:33.609 [2024-07-12 13:48:22.105169] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x207d880 00:22:33.609 [2024-07-12 13:48:22.105179] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x207d880 00:22:33.609 [2024-07-12 13:48:22.105274] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:33.609 pt4 00:22:33.609 13:48:22 
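
Re-creating pt2, pt3, and pt4 above is enough to bring raid_bdev1 back without another bdev_raid_create: each passthru re-exposes the raid superblock written when the array was built, and the raid module re-assembles the bdev once enough members reappear. A hedged sketch of that re-assembly loop (bdev names and UUIDs mirror the log; the loop itself is illustrative):

    # Recreate the passthru bdevs over their malloc backings; examine finds the
    # raid superblock on each and raid_bdev1 moves from configuring to online.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    for i in 2 3 4; do
        "$rpc" -s "$sock" bdev_passthru_create -b "malloc$i" -p "pt$i" \
               -u "00000000-0000-0000-0000-00000000000$i"
    done
    "$rpc" -s "$sock" bdev_raid_get_bdevs all |
        jq -r '.[] | select(.name == "raid_bdev1") | .state'
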
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:33.609 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:33.609 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:33.609 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:33.609 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:33.609 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:33.609 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:33.609 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:33.609 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:33.609 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:33.609 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.609 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:33.869 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.869 "name": "raid_bdev1", 00:22:33.869 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:33.869 "strip_size_kb": 0, 00:22:33.869 "state": "online", 00:22:33.869 "raid_level": "raid1", 00:22:33.869 "superblock": true, 00:22:33.869 "num_base_bdevs": 4, 00:22:33.869 "num_base_bdevs_discovered": 3, 00:22:33.869 "num_base_bdevs_operational": 3, 00:22:33.869 "base_bdevs_list": [ 00:22:33.869 { 00:22:33.869 "name": null, 00:22:33.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.869 "is_configured": false, 00:22:33.869 "data_offset": 2048, 00:22:33.869 "data_size": 63488 00:22:33.869 }, 00:22:33.869 { 00:22:33.869 "name": "pt2", 00:22:33.869 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:33.869 "is_configured": true, 00:22:33.869 "data_offset": 2048, 00:22:33.869 "data_size": 63488 00:22:33.869 }, 00:22:33.869 { 00:22:33.869 "name": "pt3", 00:22:33.869 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:33.869 "is_configured": true, 00:22:33.869 "data_offset": 2048, 00:22:33.869 "data_size": 63488 00:22:33.869 }, 00:22:33.869 { 00:22:33.869 "name": "pt4", 00:22:33.869 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:33.869 "is_configured": true, 00:22:33.869 "data_offset": 2048, 00:22:33.869 "data_size": 63488 00:22:33.869 } 00:22:33.869 ] 00:22:33.869 }' 00:22:33.869 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.869 13:48:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:34.437 13:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:34.696 [2024-07-12 13:48:23.167036] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:34.696 [2024-07-12 13:48:23.167062] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:34.696 [2024-07-12 13:48:23.167114] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:22:34.696 [2024-07-12 13:48:23.167176] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:34.696 [2024-07-12 13:48:23.167188] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x207d880 name raid_bdev1, state offline 00:22:34.696 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.696 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:34.954 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:34.955 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:34.955 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:22:34.955 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:22:34.955 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:35.213 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:35.474 [2024-07-12 13:48:23.904964] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:35.474 [2024-07-12 13:48:23.905008] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:35.474 [2024-07-12 13:48:23.905026] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2089960 00:22:35.474 [2024-07-12 13:48:23.905038] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:35.474 [2024-07-12 13:48:23.906631] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:35.474 [2024-07-12 13:48:23.906659] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:35.474 [2024-07-12 13:48:23.906727] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:35.474 [2024-07-12 13:48:23.906752] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:35.474 [2024-07-12 13:48:23.906848] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:35.474 [2024-07-12 13:48:23.906861] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:35.474 [2024-07-12 13:48:23.906875] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x212da30 name raid_bdev1, state configuring 00:22:35.474 [2024-07-12 13:48:23.906899] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:35.474 [2024-07-12 13:48:23.906983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:35.474 pt1 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.474 13:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.733 13:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.733 "name": "raid_bdev1", 00:22:35.734 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:35.734 "strip_size_kb": 0, 00:22:35.734 "state": "configuring", 00:22:35.734 "raid_level": "raid1", 00:22:35.734 "superblock": true, 00:22:35.734 "num_base_bdevs": 4, 00:22:35.734 "num_base_bdevs_discovered": 2, 00:22:35.734 "num_base_bdevs_operational": 3, 00:22:35.734 "base_bdevs_list": [ 00:22:35.734 { 00:22:35.734 "name": null, 00:22:35.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.734 "is_configured": false, 00:22:35.734 "data_offset": 2048, 00:22:35.734 "data_size": 63488 00:22:35.734 }, 00:22:35.734 { 00:22:35.734 "name": "pt2", 00:22:35.734 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:35.734 "is_configured": true, 00:22:35.734 "data_offset": 2048, 00:22:35.734 "data_size": 63488 00:22:35.734 }, 00:22:35.734 { 00:22:35.734 "name": "pt3", 00:22:35.734 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:35.734 "is_configured": true, 00:22:35.734 "data_offset": 2048, 00:22:35.734 "data_size": 63488 00:22:35.734 }, 00:22:35.734 { 00:22:35.734 "name": null, 00:22:35.734 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:35.734 "is_configured": false, 00:22:35.734 "data_offset": 2048, 00:22:35.734 "data_size": 63488 00:22:35.734 } 00:22:35.734 ] 00:22:35.734 }' 00:22:35.734 13:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.734 13:48:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:36.301 13:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:36.301 13:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:22:36.560 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:22:36.560 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:36.820 [2024-07-12 13:48:25.252557] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:36.820 [2024-07-12 13:48:25.252605] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:36.820 [2024-07-12 13:48:25.252626] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2088bf0 00:22:36.820 [2024-07-12 13:48:25.252639] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:36.820 [2024-07-12 13:48:25.252992] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:36.820 [2024-07-12 13:48:25.253010] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:36.820 [2024-07-12 13:48:25.253073] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:36.820 [2024-07-12 13:48:25.253093] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:36.821 [2024-07-12 13:48:25.253207] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2088f10 00:22:36.821 [2024-07-12 13:48:25.253217] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:36.821 [2024-07-12 13:48:25.253383] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x212def0 00:22:36.821 [2024-07-12 13:48:25.253513] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2088f10 00:22:36.821 [2024-07-12 13:48:25.253523] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2088f10 00:22:36.821 [2024-07-12 13:48:25.253620] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:36.821 pt4 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.821 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:37.080 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.080 "name": "raid_bdev1", 00:22:37.080 "uuid": "25dfad31-718e-4688-8e39-43b8820aea95", 00:22:37.080 "strip_size_kb": 0, 00:22:37.080 "state": "online", 00:22:37.080 "raid_level": "raid1", 00:22:37.080 "superblock": true, 00:22:37.080 "num_base_bdevs": 4, 00:22:37.080 "num_base_bdevs_discovered": 3, 00:22:37.080 "num_base_bdevs_operational": 3, 00:22:37.080 "base_bdevs_list": [ 00:22:37.080 { 00:22:37.080 "name": null, 00:22:37.080 
"uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.080 "is_configured": false, 00:22:37.080 "data_offset": 2048, 00:22:37.080 "data_size": 63488 00:22:37.080 }, 00:22:37.080 { 00:22:37.080 "name": "pt2", 00:22:37.080 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:37.080 "is_configured": true, 00:22:37.080 "data_offset": 2048, 00:22:37.080 "data_size": 63488 00:22:37.080 }, 00:22:37.080 { 00:22:37.080 "name": "pt3", 00:22:37.080 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:37.080 "is_configured": true, 00:22:37.080 "data_offset": 2048, 00:22:37.080 "data_size": 63488 00:22:37.080 }, 00:22:37.080 { 00:22:37.080 "name": "pt4", 00:22:37.080 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:37.080 "is_configured": true, 00:22:37.080 "data_offset": 2048, 00:22:37.080 "data_size": 63488 00:22:37.080 } 00:22:37.080 ] 00:22:37.080 }' 00:22:37.080 13:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.080 13:48:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:37.650 13:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:37.650 13:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:37.910 13:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:37.910 13:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:37.910 13:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:38.170 [2024-07-12 13:48:26.576342] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 25dfad31-718e-4688-8e39-43b8820aea95 '!=' 25dfad31-718e-4688-8e39-43b8820aea95 ']' 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 531178 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 531178 ']' 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 531178 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 531178 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 531178' 00:22:38.170 killing process with pid 531178 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 531178 00:22:38.170 [2024-07-12 13:48:26.645706] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:38.170 [2024-07-12 13:48:26.645761] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:38.170 [2024-07-12 13:48:26.645827] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:38.170 [2024-07-12 13:48:26.645839] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2088f10 name raid_bdev1, state offline 00:22:38.170 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 531178 00:22:38.170 [2024-07-12 13:48:26.682691] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:38.429 13:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:22:38.429 00:22:38.429 real 0m28.000s 00:22:38.429 user 0m51.535s 00:22:38.429 sys 0m4.784s 00:22:38.429 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:38.429 13:48:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:38.429 ************************************ 00:22:38.429 END TEST raid_superblock_test 00:22:38.429 ************************************ 00:22:38.429 13:48:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:38.429 13:48:26 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:22:38.429 13:48:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:38.429 13:48:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:38.429 13:48:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:38.429 ************************************ 00:22:38.429 START TEST raid_read_error_test 00:22:38.429 ************************************ 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:38.429 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:38.430 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 
'BaseBdev3' 'BaseBdev4') 00:22:38.430 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:38.430 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:38.430 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:38.430 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:38.430 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:38.430 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:38.430 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:38.430 13:48:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:38.430 13:48:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:38.430 13:48:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.guq0Vp5qtQ 00:22:38.430 13:48:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=535872 00:22:38.430 13:48:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 535872 /var/tmp/spdk-raid.sock 00:22:38.430 13:48:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:38.430 13:48:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 535872 ']' 00:22:38.430 13:48:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:38.430 13:48:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:38.430 13:48:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:38.430 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:38.689 13:48:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:38.689 13:48:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:38.689 [2024-07-12 13:48:27.068131] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
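
The bdevperf process started above is driven entirely over the raid socket; a condensed sketch of the flow the following log covers, assembled from the RPC and bdevperf invocations that appear in it (paths and options are copied from the logged command lines, the loop and exact ordering are hedged):

    # Start bdevperf idle (-z), build four malloc -> error -> passthru stacks,
    # create a superblocked raid1 across them, then inject a read failure into
    # the first base bdev's error layer while the queued I/O run is in flight.
    spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
    sock=/var/tmp/spdk-raid.sock
    "$spdk/build/examples/bdevperf" -r "$sock" -T raid_bdev1 -t 60 \
        -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid &
    # (the test waits for the RPC socket to come up before issuing calls)
    for bdev in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
        "$spdk/scripts/rpc.py" -s "$sock" bdev_malloc_create 32 512 -b "${bdev}_malloc"
        "$spdk/scripts/rpc.py" -s "$sock" bdev_error_create "${bdev}_malloc"
        "$spdk/scripts/rpc.py" -s "$sock" bdev_passthru_create -b "EE_${bdev}_malloc" -p "$bdev"
    done
    "$spdk/scripts/rpc.py" -s "$sock" bdev_raid_create -r raid1 \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
    "$spdk/examples/bdev/bdevperf/bdevperf.py" -s "$sock" perform_tests &
    sleep 1
    "$spdk/scripts/rpc.py" -s "$sock" bdev_error_inject_error EE_BaseBdev1_malloc read failure

Because the level is raid1 and the injected failure is a read, the state check that follows expects all 4 base bdevs to stay discovered and raid_bdev1 to remain online.
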
00:22:38.689 [2024-07-12 13:48:27.068197] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid535872 ] 00:22:38.689 [2024-07-12 13:48:27.190974] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:38.950 [2024-07-12 13:48:27.294511] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:38.950 [2024-07-12 13:48:27.365415] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:38.950 [2024-07-12 13:48:27.365449] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:38.950 13:48:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:38.950 13:48:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:38.950 13:48:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:38.950 13:48:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:39.209 BaseBdev1_malloc 00:22:39.209 13:48:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:39.468 true 00:22:39.468 13:48:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:39.727 [2024-07-12 13:48:28.148021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:39.727 [2024-07-12 13:48:28.148064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:39.727 [2024-07-12 13:48:28.148085] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a0ca10 00:22:39.727 [2024-07-12 13:48:28.148098] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:39.727 [2024-07-12 13:48:28.149951] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:39.727 [2024-07-12 13:48:28.149979] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:39.727 BaseBdev1 00:22:39.727 13:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:39.727 13:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:39.986 BaseBdev2_malloc 00:22:39.986 13:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:40.244 true 00:22:40.244 13:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:40.503 [2024-07-12 13:48:28.874532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:40.503 [2024-07-12 13:48:28.874576] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:40.503 [2024-07-12 13:48:28.874597] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a11250 00:22:40.503 [2024-07-12 13:48:28.874610] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:40.503 [2024-07-12 13:48:28.876231] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:40.503 [2024-07-12 13:48:28.876258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:40.503 BaseBdev2 00:22:40.503 13:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:40.503 13:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:40.762 BaseBdev3_malloc 00:22:40.762 13:48:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:41.021 true 00:22:41.021 13:48:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:41.280 [2024-07-12 13:48:29.610385] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:41.280 [2024-07-12 13:48:29.610432] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:41.280 [2024-07-12 13:48:29.610453] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a13510 00:22:41.280 [2024-07-12 13:48:29.610465] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:41.280 [2024-07-12 13:48:29.612081] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:41.280 [2024-07-12 13:48:29.612110] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:41.280 BaseBdev3 00:22:41.280 13:48:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:41.280 13:48:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:41.280 BaseBdev4_malloc 00:22:41.541 13:48:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:41.541 true 00:22:41.541 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:41.801 [2024-07-12 13:48:30.340875] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:41.801 [2024-07-12 13:48:30.340922] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:41.801 [2024-07-12 13:48:30.340950] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a143e0 00:22:41.801 [2024-07-12 13:48:30.340963] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:41.801 [2024-07-12 13:48:30.342527] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:22:41.801 [2024-07-12 13:48:30.342556] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:41.801 BaseBdev4 00:22:41.801 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:42.060 [2024-07-12 13:48:30.585555] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:42.060 [2024-07-12 13:48:30.586911] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:42.060 [2024-07-12 13:48:30.586989] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:42.060 [2024-07-12 13:48:30.587051] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:42.060 [2024-07-12 13:48:30.587281] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a0e560 00:22:42.060 [2024-07-12 13:48:30.587292] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:42.060 [2024-07-12 13:48:30.587488] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1862ba0 00:22:42.060 [2024-07-12 13:48:30.587643] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a0e560 00:22:42.060 [2024-07-12 13:48:30.587654] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a0e560 00:22:42.060 [2024-07-12 13:48:30.587761] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.060 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.320 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.320 "name": "raid_bdev1", 00:22:42.320 "uuid": "7ef1a486-3e43-4cfd-b753-a6b8cd86082f", 00:22:42.320 "strip_size_kb": 0, 00:22:42.320 "state": "online", 00:22:42.320 "raid_level": "raid1", 00:22:42.320 "superblock": true, 00:22:42.320 "num_base_bdevs": 4, 00:22:42.320 "num_base_bdevs_discovered": 4, 00:22:42.320 
"num_base_bdevs_operational": 4, 00:22:42.320 "base_bdevs_list": [ 00:22:42.320 { 00:22:42.320 "name": "BaseBdev1", 00:22:42.320 "uuid": "7f673bb8-c83a-550a-aa95-efc66341a191", 00:22:42.320 "is_configured": true, 00:22:42.320 "data_offset": 2048, 00:22:42.320 "data_size": 63488 00:22:42.320 }, 00:22:42.320 { 00:22:42.320 "name": "BaseBdev2", 00:22:42.320 "uuid": "d9fe29e5-04a4-5e94-9b1c-7eee6d0075da", 00:22:42.320 "is_configured": true, 00:22:42.320 "data_offset": 2048, 00:22:42.320 "data_size": 63488 00:22:42.320 }, 00:22:42.320 { 00:22:42.320 "name": "BaseBdev3", 00:22:42.320 "uuid": "15e7893f-24bc-5fb2-998e-9fbd05133a85", 00:22:42.320 "is_configured": true, 00:22:42.320 "data_offset": 2048, 00:22:42.320 "data_size": 63488 00:22:42.320 }, 00:22:42.320 { 00:22:42.320 "name": "BaseBdev4", 00:22:42.320 "uuid": "b1421221-2b9f-54aa-b64f-82267f78445a", 00:22:42.320 "is_configured": true, 00:22:42.320 "data_offset": 2048, 00:22:42.320 "data_size": 63488 00:22:42.320 } 00:22:42.320 ] 00:22:42.320 }' 00:22:42.320 13:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.320 13:48:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:42.889 13:48:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:42.889 13:48:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:43.148 [2024-07-12 13:48:31.536348] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18625a0 00:22:44.086 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.345 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.605 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.605 "name": "raid_bdev1", 00:22:44.605 "uuid": "7ef1a486-3e43-4cfd-b753-a6b8cd86082f", 00:22:44.605 "strip_size_kb": 0, 00:22:44.605 "state": "online", 00:22:44.605 "raid_level": "raid1", 00:22:44.605 "superblock": true, 00:22:44.605 "num_base_bdevs": 4, 00:22:44.605 "num_base_bdevs_discovered": 4, 00:22:44.605 "num_base_bdevs_operational": 4, 00:22:44.605 "base_bdevs_list": [ 00:22:44.605 { 00:22:44.605 "name": "BaseBdev1", 00:22:44.605 "uuid": "7f673bb8-c83a-550a-aa95-efc66341a191", 00:22:44.605 "is_configured": true, 00:22:44.605 "data_offset": 2048, 00:22:44.605 "data_size": 63488 00:22:44.605 }, 00:22:44.605 { 00:22:44.605 "name": "BaseBdev2", 00:22:44.605 "uuid": "d9fe29e5-04a4-5e94-9b1c-7eee6d0075da", 00:22:44.605 "is_configured": true, 00:22:44.605 "data_offset": 2048, 00:22:44.605 "data_size": 63488 00:22:44.605 }, 00:22:44.605 { 00:22:44.605 "name": "BaseBdev3", 00:22:44.605 "uuid": "15e7893f-24bc-5fb2-998e-9fbd05133a85", 00:22:44.605 "is_configured": true, 00:22:44.605 "data_offset": 2048, 00:22:44.605 "data_size": 63488 00:22:44.605 }, 00:22:44.605 { 00:22:44.605 "name": "BaseBdev4", 00:22:44.605 "uuid": "b1421221-2b9f-54aa-b64f-82267f78445a", 00:22:44.605 "is_configured": true, 00:22:44.605 "data_offset": 2048, 00:22:44.605 "data_size": 63488 00:22:44.605 } 00:22:44.605 ] 00:22:44.605 }' 00:22:44.605 13:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.605 13:48:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:45.173 13:48:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:45.433 [2024-07-12 13:48:33.762210] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:45.433 [2024-07-12 13:48:33.762244] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:45.433 [2024-07-12 13:48:33.765614] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:45.433 [2024-07-12 13:48:33.765652] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:45.433 [2024-07-12 13:48:33.765769] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:45.433 [2024-07-12 13:48:33.765781] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a0e560 name raid_bdev1, state offline 00:22:45.433 0 00:22:45.433 13:48:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 535872 00:22:45.433 13:48:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 535872 ']' 00:22:45.433 13:48:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 535872 00:22:45.433 13:48:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:22:45.433 13:48:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:45.433 13:48:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 535872 00:22:45.433 13:48:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:22:45.433 13:48:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:45.433 13:48:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 535872' 00:22:45.433 killing process with pid 535872 00:22:45.433 13:48:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 535872 00:22:45.433 [2024-07-12 13:48:33.844190] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:45.433 13:48:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 535872 00:22:45.433 [2024-07-12 13:48:33.874260] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:45.693 13:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.guq0Vp5qtQ 00:22:45.693 13:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:45.693 13:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:45.693 13:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:45.693 13:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:45.693 13:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:45.693 13:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:45.693 13:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:45.693 00:22:45.693 real 0m7.107s 00:22:45.693 user 0m11.740s 00:22:45.693 sys 0m1.261s 00:22:45.693 13:48:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:45.693 13:48:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:45.693 ************************************ 00:22:45.693 END TEST raid_read_error_test 00:22:45.693 ************************************ 00:22:45.693 13:48:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:45.693 13:48:34 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:22:45.693 13:48:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:45.693 13:48:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:45.693 13:48:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:45.693 ************************************ 00:22:45.693 START TEST raid_write_error_test 00:22:45.693 ************************************ 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.DjxIr0f3ys 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=536853 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 536853 /var/tmp/spdk-raid.sock 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 536853 ']' 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:45.693 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:45.693 13:48:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:45.693 [2024-07-12 13:48:34.253097] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:22:45.693 [2024-07-12 13:48:34.253160] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid536853 ] 00:22:45.953 [2024-07-12 13:48:34.380511] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:45.953 [2024-07-12 13:48:34.486733] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:46.212 [2024-07-12 13:48:34.551452] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:46.212 [2024-07-12 13:48:34.551494] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:46.212 13:48:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:46.212 13:48:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:46.212 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:46.212 13:48:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:46.780 BaseBdev1_malloc 00:22:46.780 13:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:47.348 true 00:22:47.348 13:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:47.917 [2024-07-12 13:48:36.229526] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:47.917 [2024-07-12 13:48:36.229571] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.917 [2024-07-12 13:48:36.229592] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27eda10 00:22:47.917 [2024-07-12 13:48:36.229604] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.917 [2024-07-12 13:48:36.231487] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.917 [2024-07-12 13:48:36.231517] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:47.917 BaseBdev1 00:22:47.917 13:48:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:47.917 13:48:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:48.177 BaseBdev2_malloc 00:22:48.436 13:48:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:48.695 true 00:22:48.954 13:48:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:49.213 [2024-07-12 13:48:37.766622] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:49.213 [2024-07-12 13:48:37.766668] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:49.213 [2024-07-12 13:48:37.766689] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27f2250 00:22:49.213 [2024-07-12 13:48:37.766702] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:49.213 [2024-07-12 13:48:37.768317] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:49.213 [2024-07-12 13:48:37.768344] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:49.213 BaseBdev2 00:22:49.472 13:48:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:49.472 13:48:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:49.731 BaseBdev3_malloc 00:22:49.990 13:48:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:50.250 true 00:22:50.250 13:48:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:50.818 [2024-07-12 13:48:39.311376] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:50.818 [2024-07-12 13:48:39.311421] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:50.818 [2024-07-12 13:48:39.311441] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27f4510 00:22:50.818 [2024-07-12 13:48:39.311454] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:50.818 [2024-07-12 13:48:39.313068] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:50.818 [2024-07-12 13:48:39.313097] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:50.818 BaseBdev3 00:22:50.818 13:48:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:50.818 13:48:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:51.386 BaseBdev4_malloc 00:22:51.386 13:48:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:51.955 true 00:22:51.955 13:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:52.522 [2024-07-12 13:48:40.856006] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:52.522 [2024-07-12 13:48:40.856054] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:22:52.522 [2024-07-12 13:48:40.856076] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27f53e0 00:22:52.522 [2024-07-12 13:48:40.856089] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.522 [2024-07-12 13:48:40.857723] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.522 [2024-07-12 13:48:40.857750] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:52.522 BaseBdev4 00:22:52.522 13:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:53.089 [2024-07-12 13:48:41.365531] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:53.089 [2024-07-12 13:48:41.366920] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:53.089 [2024-07-12 13:48:41.367000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:53.089 [2024-07-12 13:48:41.367061] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:53.089 [2024-07-12 13:48:41.367294] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27ef560 00:22:53.089 [2024-07-12 13:48:41.367305] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:53.089 [2024-07-12 13:48:41.367511] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2643ba0 00:22:53.089 [2024-07-12 13:48:41.367669] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27ef560 00:22:53.089 [2024-07-12 13:48:41.367679] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27ef560 00:22:53.089 [2024-07-12 13:48:41.367791] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.089 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:53.405 13:48:41 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:53.405 "name": "raid_bdev1", 00:22:53.405 "uuid": "19d26319-1c42-451f-b3ce-7049389c432a", 00:22:53.405 "strip_size_kb": 0, 00:22:53.405 "state": "online", 00:22:53.405 "raid_level": "raid1", 00:22:53.405 "superblock": true, 00:22:53.405 "num_base_bdevs": 4, 00:22:53.405 "num_base_bdevs_discovered": 4, 00:22:53.405 "num_base_bdevs_operational": 4, 00:22:53.405 "base_bdevs_list": [ 00:22:53.405 { 00:22:53.405 "name": "BaseBdev1", 00:22:53.405 "uuid": "47d03735-6c51-52a9-9569-cd7ed7142caa", 00:22:53.405 "is_configured": true, 00:22:53.405 "data_offset": 2048, 00:22:53.405 "data_size": 63488 00:22:53.405 }, 00:22:53.405 { 00:22:53.405 "name": "BaseBdev2", 00:22:53.405 "uuid": "605eeaf7-dd5e-5526-b162-9a42b2654071", 00:22:53.405 "is_configured": true, 00:22:53.405 "data_offset": 2048, 00:22:53.405 "data_size": 63488 00:22:53.405 }, 00:22:53.405 { 00:22:53.405 "name": "BaseBdev3", 00:22:53.405 "uuid": "2947cac6-3a4c-563b-8c0b-bdc43ff77ca0", 00:22:53.405 "is_configured": true, 00:22:53.405 "data_offset": 2048, 00:22:53.405 "data_size": 63488 00:22:53.405 }, 00:22:53.405 { 00:22:53.405 "name": "BaseBdev4", 00:22:53.405 "uuid": "d99c25f0-ec4e-5434-8963-7bfe5a2b60e6", 00:22:53.405 "is_configured": true, 00:22:53.405 "data_offset": 2048, 00:22:53.405 "data_size": 63488 00:22:53.405 } 00:22:53.405 ] 00:22:53.405 }' 00:22:53.405 13:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:53.405 13:48:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:53.972 13:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:53.972 13:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:54.230 [2024-07-12 13:48:42.713375] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26435a0 00:22:55.165 13:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:55.424 [2024-07-12 13:48:43.985207] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:22:55.424 [2024-07-12 13:48:43.985261] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:55.424 [2024-07-12 13:48:43.985476] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x26435a0 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:55.683 
13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.683 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.249 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.249 "name": "raid_bdev1", 00:22:56.249 "uuid": "19d26319-1c42-451f-b3ce-7049389c432a", 00:22:56.249 "strip_size_kb": 0, 00:22:56.249 "state": "online", 00:22:56.249 "raid_level": "raid1", 00:22:56.249 "superblock": true, 00:22:56.249 "num_base_bdevs": 4, 00:22:56.249 "num_base_bdevs_discovered": 3, 00:22:56.249 "num_base_bdevs_operational": 3, 00:22:56.249 "base_bdevs_list": [ 00:22:56.249 { 00:22:56.249 "name": null, 00:22:56.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:56.249 "is_configured": false, 00:22:56.249 "data_offset": 2048, 00:22:56.249 "data_size": 63488 00:22:56.249 }, 00:22:56.249 { 00:22:56.249 "name": "BaseBdev2", 00:22:56.249 "uuid": "605eeaf7-dd5e-5526-b162-9a42b2654071", 00:22:56.249 "is_configured": true, 00:22:56.249 "data_offset": 2048, 00:22:56.249 "data_size": 63488 00:22:56.249 }, 00:22:56.249 { 00:22:56.249 "name": "BaseBdev3", 00:22:56.249 "uuid": "2947cac6-3a4c-563b-8c0b-bdc43ff77ca0", 00:22:56.249 "is_configured": true, 00:22:56.249 "data_offset": 2048, 00:22:56.250 "data_size": 63488 00:22:56.250 }, 00:22:56.250 { 00:22:56.250 "name": "BaseBdev4", 00:22:56.250 "uuid": "d99c25f0-ec4e-5434-8963-7bfe5a2b60e6", 00:22:56.250 "is_configured": true, 00:22:56.250 "data_offset": 2048, 00:22:56.250 "data_size": 63488 00:22:56.250 } 00:22:56.250 ] 00:22:56.250 }' 00:22:56.250 13:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.250 13:48:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:56.817 [2024-07-12 13:48:45.311720] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:56.817 [2024-07-12 13:48:45.311761] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:56.817 [2024-07-12 13:48:45.314888] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:56.817 [2024-07-12 13:48:45.314923] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:56.817 [2024-07-12 13:48:45.315027] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:56.817 [2024-07-12 13:48:45.315039] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27ef560 name raid_bdev1, state 
offline 00:22:56.817 0 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 536853 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 536853 ']' 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 536853 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 536853 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 536853' 00:22:56.817 killing process with pid 536853 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 536853 00:22:56.817 [2024-07-12 13:48:45.383536] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:56.817 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 536853 00:22:57.076 [2024-07-12 13:48:45.415580] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:57.076 13:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.DjxIr0f3ys 00:22:57.076 13:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:57.076 13:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:57.076 13:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:57.076 13:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:57.076 13:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:57.076 13:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:57.076 13:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:57.076 00:22:57.076 real 0m11.478s 00:22:57.076 user 0m19.873s 00:22:57.076 sys 0m1.938s 00:22:57.076 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:57.076 13:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:57.076 ************************************ 00:22:57.076 END TEST raid_write_error_test 00:22:57.076 ************************************ 00:22:57.334 13:48:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:57.335 13:48:45 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:22:57.335 13:48:45 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:22:57.335 13:48:45 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:22:57.335 13:48:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:57.335 13:48:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:57.335 13:48:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:57.335 ************************************ 00:22:57.335 START TEST raid_rebuild_test 00:22:57.335 ************************************ 00:22:57.335 13:48:45 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=538511 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 538511 /var/tmp/spdk-raid.sock 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 538511 ']' 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:57.335 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:57.335 13:48:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:57.335 [2024-07-12 13:48:45.818498] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:22:57.335 [2024-07-12 13:48:45.818573] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid538511 ] 00:22:57.335 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:57.335 Zero copy mechanism will not be used. 00:22:57.594 [2024-07-12 13:48:45.948283] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:57.594 [2024-07-12 13:48:46.050536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:57.594 [2024-07-12 13:48:46.108413] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:57.594 [2024-07-12 13:48:46.108441] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:58.161 13:48:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:58.161 13:48:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:22:58.161 13:48:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:58.161 13:48:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:58.420 BaseBdev1_malloc 00:22:58.420 13:48:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:58.699 [2024-07-12 13:48:47.168797] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:58.699 [2024-07-12 13:48:47.168846] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:58.699 [2024-07-12 13:48:47.168869] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2638680 00:22:58.699 [2024-07-12 13:48:47.168881] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:58.699 [2024-07-12 13:48:47.170475] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:58.699 [2024-07-12 13:48:47.170504] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:58.699 BaseBdev1 00:22:58.699 13:48:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:58.699 13:48:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:58.958 BaseBdev2_malloc 00:22:58.958 13:48:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:59.216 [2024-07-12 13:48:47.678913] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:59.216 [2024-07-12 13:48:47.678964] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:59.216 
[2024-07-12 13:48:47.678987] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26391a0 00:22:59.216 [2024-07-12 13:48:47.679000] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:59.216 [2024-07-12 13:48:47.680465] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:59.216 [2024-07-12 13:48:47.680494] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:59.216 BaseBdev2 00:22:59.216 13:48:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:59.498 spare_malloc 00:22:59.498 13:48:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:59.757 spare_delay 00:22:59.757 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:00.015 [2024-07-12 13:48:48.369297] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:00.015 [2024-07-12 13:48:48.369349] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:00.015 [2024-07-12 13:48:48.369370] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27e7800 00:23:00.015 [2024-07-12 13:48:48.369383] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:00.015 [2024-07-12 13:48:48.370828] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:00.015 [2024-07-12 13:48:48.370857] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:00.015 spare 00:23:00.015 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:00.274 [2024-07-12 13:48:48.617971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:00.274 [2024-07-12 13:48:48.619213] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:00.274 [2024-07-12 13:48:48.619288] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27e89b0 00:23:00.274 [2024-07-12 13:48:48.619299] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:00.274 [2024-07-12 13:48:48.619511] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27e1dd0 00:23:00.274 [2024-07-12 13:48:48.619650] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27e89b0 00:23:00.274 [2024-07-12 13:48:48.619660] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27e89b0 00:23:00.274 [2024-07-12 13:48:48.619770] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:00.274 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:00.274 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:00.274 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:23:00.274 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:00.274 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:00.275 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:00.275 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:00.275 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:00.275 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:00.275 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:00.275 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.275 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.275 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:00.275 "name": "raid_bdev1", 00:23:00.275 "uuid": "92c9ddd7-6b43-439b-9e72-0c8857149f5b", 00:23:00.275 "strip_size_kb": 0, 00:23:00.275 "state": "online", 00:23:00.275 "raid_level": "raid1", 00:23:00.275 "superblock": false, 00:23:00.275 "num_base_bdevs": 2, 00:23:00.275 "num_base_bdevs_discovered": 2, 00:23:00.275 "num_base_bdevs_operational": 2, 00:23:00.275 "base_bdevs_list": [ 00:23:00.275 { 00:23:00.275 "name": "BaseBdev1", 00:23:00.275 "uuid": "489ce36a-1d42-59d3-ae14-9a342dbae8e9", 00:23:00.275 "is_configured": true, 00:23:00.275 "data_offset": 0, 00:23:00.275 "data_size": 65536 00:23:00.275 }, 00:23:00.275 { 00:23:00.275 "name": "BaseBdev2", 00:23:00.275 "uuid": "adbb97db-b8a4-5b99-ae62-8ca6d376f4fa", 00:23:00.275 "is_configured": true, 00:23:00.275 "data_offset": 0, 00:23:00.275 "data_size": 65536 00:23:00.275 } 00:23:00.275 ] 00:23:00.275 }' 00:23:00.275 13:48:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:00.275 13:48:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:00.842 13:48:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:00.842 13:48:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:01.100 [2024-07-12 13:48:49.540645] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:01.100 13:48:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:01.101 13:48:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.101 13:48:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:01.668 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:02.235 [2024-07-12 13:48:50.579205] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27e1dd0 00:23:02.235 /dev/nbd0 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:02.235 1+0 records in 00:23:02.235 1+0 records out 00:23:02.235 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248242 s, 16.5 MB/s 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:02.235 13:48:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:02.236 13:48:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:02.236 13:48:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 
00:23:02.236 13:48:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:23:08.802 65536+0 records in 00:23:08.802 65536+0 records out 00:23:08.802 33554432 bytes (34 MB, 32 MiB) copied, 6.23787 s, 5.4 MB/s 00:23:08.802 13:48:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:08.802 13:48:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:08.802 13:48:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:08.802 13:48:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:08.802 13:48:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:08.802 13:48:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:08.802 13:48:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:08.802 [2024-07-12 13:48:57.151579] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:08.802 [2024-07-12 13:48:57.323230] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.802 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.060 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.060 "name": "raid_bdev1", 00:23:09.060 "uuid": "92c9ddd7-6b43-439b-9e72-0c8857149f5b", 00:23:09.060 "strip_size_kb": 0, 00:23:09.060 "state": "online", 00:23:09.060 "raid_level": "raid1", 00:23:09.060 "superblock": false, 00:23:09.060 "num_base_bdevs": 2, 00:23:09.060 "num_base_bdevs_discovered": 1, 00:23:09.060 "num_base_bdevs_operational": 1, 00:23:09.060 "base_bdevs_list": [ 00:23:09.060 { 00:23:09.060 "name": null, 00:23:09.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.060 "is_configured": false, 00:23:09.060 "data_offset": 0, 00:23:09.060 "data_size": 65536 00:23:09.060 }, 00:23:09.060 { 00:23:09.060 "name": "BaseBdev2", 00:23:09.060 "uuid": "adbb97db-b8a4-5b99-ae62-8ca6d376f4fa", 00:23:09.060 "is_configured": true, 00:23:09.060 "data_offset": 0, 00:23:09.060 "data_size": 65536 00:23:09.060 } 00:23:09.060 ] 00:23:09.060 }' 00:23:09.060 13:48:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.060 13:48:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:09.628 13:48:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:09.887 [2024-07-12 13:48:58.418153] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:09.887 [2024-07-12 13:48:58.423141] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27e91c0 00:23:09.887 [2024-07-12 13:48:58.425387] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:09.887 13:48:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:11.264 "name": "raid_bdev1", 00:23:11.264 "uuid": "92c9ddd7-6b43-439b-9e72-0c8857149f5b", 00:23:11.264 "strip_size_kb": 0, 00:23:11.264 "state": "online", 00:23:11.264 "raid_level": "raid1", 00:23:11.264 "superblock": false, 00:23:11.264 "num_base_bdevs": 2, 00:23:11.264 "num_base_bdevs_discovered": 2, 00:23:11.264 "num_base_bdevs_operational": 2, 00:23:11.264 "process": { 00:23:11.264 "type": "rebuild", 00:23:11.264 "target": "spare", 00:23:11.264 "progress": { 00:23:11.264 "blocks": 24576, 00:23:11.264 "percent": 37 00:23:11.264 } 00:23:11.264 }, 00:23:11.264 
"base_bdevs_list": [ 00:23:11.264 { 00:23:11.264 "name": "spare", 00:23:11.264 "uuid": "ad075708-25c0-594b-862f-cd40592a4f79", 00:23:11.264 "is_configured": true, 00:23:11.264 "data_offset": 0, 00:23:11.264 "data_size": 65536 00:23:11.264 }, 00:23:11.264 { 00:23:11.264 "name": "BaseBdev2", 00:23:11.264 "uuid": "adbb97db-b8a4-5b99-ae62-8ca6d376f4fa", 00:23:11.264 "is_configured": true, 00:23:11.264 "data_offset": 0, 00:23:11.264 "data_size": 65536 00:23:11.264 } 00:23:11.264 ] 00:23:11.264 }' 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:11.264 13:48:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:11.522 [2024-07-12 13:49:00.007313] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:11.522 [2024-07-12 13:49:00.038208] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:11.522 [2024-07-12 13:49:00.038268] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:11.522 [2024-07-12 13:49:00.038284] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:11.522 [2024-07-12 13:49:00.038293] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:11.522 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:11.522 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:11.523 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:11.523 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:11.523 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:11.523 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:11.523 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:11.523 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:11.523 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:11.523 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:11.523 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.523 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.781 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.781 "name": "raid_bdev1", 00:23:11.781 "uuid": "92c9ddd7-6b43-439b-9e72-0c8857149f5b", 00:23:11.781 "strip_size_kb": 0, 00:23:11.781 "state": "online", 00:23:11.781 "raid_level": "raid1", 00:23:11.781 "superblock": false, 00:23:11.781 "num_base_bdevs": 2, 00:23:11.781 
"num_base_bdevs_discovered": 1, 00:23:11.781 "num_base_bdevs_operational": 1, 00:23:11.781 "base_bdevs_list": [ 00:23:11.781 { 00:23:11.781 "name": null, 00:23:11.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.781 "is_configured": false, 00:23:11.781 "data_offset": 0, 00:23:11.781 "data_size": 65536 00:23:11.781 }, 00:23:11.781 { 00:23:11.781 "name": "BaseBdev2", 00:23:11.781 "uuid": "adbb97db-b8a4-5b99-ae62-8ca6d376f4fa", 00:23:11.781 "is_configured": true, 00:23:11.781 "data_offset": 0, 00:23:11.781 "data_size": 65536 00:23:11.781 } 00:23:11.781 ] 00:23:11.781 }' 00:23:11.781 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.781 13:49:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:12.348 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:12.348 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:12.348 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:12.348 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:12.348 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:12.348 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.348 13:49:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.606 13:49:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:12.606 "name": "raid_bdev1", 00:23:12.606 "uuid": "92c9ddd7-6b43-439b-9e72-0c8857149f5b", 00:23:12.606 "strip_size_kb": 0, 00:23:12.606 "state": "online", 00:23:12.606 "raid_level": "raid1", 00:23:12.606 "superblock": false, 00:23:12.606 "num_base_bdevs": 2, 00:23:12.606 "num_base_bdevs_discovered": 1, 00:23:12.606 "num_base_bdevs_operational": 1, 00:23:12.606 "base_bdevs_list": [ 00:23:12.606 { 00:23:12.606 "name": null, 00:23:12.606 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.606 "is_configured": false, 00:23:12.606 "data_offset": 0, 00:23:12.606 "data_size": 65536 00:23:12.606 }, 00:23:12.606 { 00:23:12.606 "name": "BaseBdev2", 00:23:12.606 "uuid": "adbb97db-b8a4-5b99-ae62-8ca6d376f4fa", 00:23:12.606 "is_configured": true, 00:23:12.606 "data_offset": 0, 00:23:12.606 "data_size": 65536 00:23:12.606 } 00:23:12.606 ] 00:23:12.606 }' 00:23:12.606 13:49:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:12.865 13:49:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:12.865 13:49:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:12.865 13:49:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:12.865 13:49:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:13.123 [2024-07-12 13:49:01.467314] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:13.123 [2024-07-12 13:49:01.472241] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27e1dd0 00:23:13.123 [2024-07-12 13:49:01.473727] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:13.123 13:49:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:14.057 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:14.057 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:14.057 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:14.057 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:14.057 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:14.057 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.057 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.316 "name": "raid_bdev1", 00:23:14.316 "uuid": "92c9ddd7-6b43-439b-9e72-0c8857149f5b", 00:23:14.316 "strip_size_kb": 0, 00:23:14.316 "state": "online", 00:23:14.316 "raid_level": "raid1", 00:23:14.316 "superblock": false, 00:23:14.316 "num_base_bdevs": 2, 00:23:14.316 "num_base_bdevs_discovered": 2, 00:23:14.316 "num_base_bdevs_operational": 2, 00:23:14.316 "process": { 00:23:14.316 "type": "rebuild", 00:23:14.316 "target": "spare", 00:23:14.316 "progress": { 00:23:14.316 "blocks": 24576, 00:23:14.316 "percent": 37 00:23:14.316 } 00:23:14.316 }, 00:23:14.316 "base_bdevs_list": [ 00:23:14.316 { 00:23:14.316 "name": "spare", 00:23:14.316 "uuid": "ad075708-25c0-594b-862f-cd40592a4f79", 00:23:14.316 "is_configured": true, 00:23:14.316 "data_offset": 0, 00:23:14.316 "data_size": 65536 00:23:14.316 }, 00:23:14.316 { 00:23:14.316 "name": "BaseBdev2", 00:23:14.316 "uuid": "adbb97db-b8a4-5b99-ae62-8ca6d376f4fa", 00:23:14.316 "is_configured": true, 00:23:14.316 "data_offset": 0, 00:23:14.316 "data_size": 65536 00:23:14.316 } 00:23:14.316 ] 00:23:14.316 }' 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=806 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:14.316 13:49:02 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.316 13:49:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.575 13:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.575 "name": "raid_bdev1", 00:23:14.575 "uuid": "92c9ddd7-6b43-439b-9e72-0c8857149f5b", 00:23:14.575 "strip_size_kb": 0, 00:23:14.575 "state": "online", 00:23:14.575 "raid_level": "raid1", 00:23:14.575 "superblock": false, 00:23:14.575 "num_base_bdevs": 2, 00:23:14.575 "num_base_bdevs_discovered": 2, 00:23:14.575 "num_base_bdevs_operational": 2, 00:23:14.575 "process": { 00:23:14.575 "type": "rebuild", 00:23:14.575 "target": "spare", 00:23:14.575 "progress": { 00:23:14.575 "blocks": 30720, 00:23:14.575 "percent": 46 00:23:14.575 } 00:23:14.575 }, 00:23:14.575 "base_bdevs_list": [ 00:23:14.575 { 00:23:14.575 "name": "spare", 00:23:14.575 "uuid": "ad075708-25c0-594b-862f-cd40592a4f79", 00:23:14.575 "is_configured": true, 00:23:14.575 "data_offset": 0, 00:23:14.575 "data_size": 65536 00:23:14.575 }, 00:23:14.575 { 00:23:14.575 "name": "BaseBdev2", 00:23:14.575 "uuid": "adbb97db-b8a4-5b99-ae62-8ca6d376f4fa", 00:23:14.575 "is_configured": true, 00:23:14.575 "data_offset": 0, 00:23:14.575 "data_size": 65536 00:23:14.575 } 00:23:14.575 ] 00:23:14.575 }' 00:23:14.575 13:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.575 13:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:14.575 13:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.833 13:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:14.833 13:49:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:15.767 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:15.767 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:15.767 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:15.767 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:15.767 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:15.767 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:15.767 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.767 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.025 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:16.025 "name": "raid_bdev1", 00:23:16.025 "uuid": "92c9ddd7-6b43-439b-9e72-0c8857149f5b", 00:23:16.025 "strip_size_kb": 0, 00:23:16.025 "state": "online", 00:23:16.025 
"raid_level": "raid1", 00:23:16.025 "superblock": false, 00:23:16.025 "num_base_bdevs": 2, 00:23:16.025 "num_base_bdevs_discovered": 2, 00:23:16.025 "num_base_bdevs_operational": 2, 00:23:16.025 "process": { 00:23:16.025 "type": "rebuild", 00:23:16.025 "target": "spare", 00:23:16.025 "progress": { 00:23:16.025 "blocks": 59392, 00:23:16.025 "percent": 90 00:23:16.025 } 00:23:16.025 }, 00:23:16.025 "base_bdevs_list": [ 00:23:16.025 { 00:23:16.025 "name": "spare", 00:23:16.025 "uuid": "ad075708-25c0-594b-862f-cd40592a4f79", 00:23:16.025 "is_configured": true, 00:23:16.025 "data_offset": 0, 00:23:16.025 "data_size": 65536 00:23:16.025 }, 00:23:16.025 { 00:23:16.025 "name": "BaseBdev2", 00:23:16.025 "uuid": "adbb97db-b8a4-5b99-ae62-8ca6d376f4fa", 00:23:16.025 "is_configured": true, 00:23:16.025 "data_offset": 0, 00:23:16.025 "data_size": 65536 00:23:16.025 } 00:23:16.025 ] 00:23:16.025 }' 00:23:16.025 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:16.025 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:16.025 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:16.025 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:16.025 13:49:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:16.284 [2024-07-12 13:49:04.699139] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:16.284 [2024-07-12 13:49:04.699200] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:16.284 [2024-07-12 13:49:04.699238] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:17.218 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:17.218 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:17.218 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:17.218 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:17.218 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:17.218 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:17.218 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.218 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.218 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:17.218 "name": "raid_bdev1", 00:23:17.218 "uuid": "92c9ddd7-6b43-439b-9e72-0c8857149f5b", 00:23:17.218 "strip_size_kb": 0, 00:23:17.218 "state": "online", 00:23:17.218 "raid_level": "raid1", 00:23:17.218 "superblock": false, 00:23:17.218 "num_base_bdevs": 2, 00:23:17.218 "num_base_bdevs_discovered": 2, 00:23:17.219 "num_base_bdevs_operational": 2, 00:23:17.219 "base_bdevs_list": [ 00:23:17.219 { 00:23:17.219 "name": "spare", 00:23:17.219 "uuid": "ad075708-25c0-594b-862f-cd40592a4f79", 00:23:17.219 "is_configured": true, 00:23:17.219 "data_offset": 0, 00:23:17.219 "data_size": 65536 00:23:17.219 }, 00:23:17.219 { 00:23:17.219 "name": "BaseBdev2", 00:23:17.219 
"uuid": "adbb97db-b8a4-5b99-ae62-8ca6d376f4fa", 00:23:17.219 "is_configured": true, 00:23:17.219 "data_offset": 0, 00:23:17.219 "data_size": 65536 00:23:17.219 } 00:23:17.219 ] 00:23:17.219 }' 00:23:17.219 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:17.219 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:17.219 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:17.477 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:17.477 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:23:17.477 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:17.477 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:17.477 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:17.477 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:17.477 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:17.477 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.477 13:49:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:17.736 "name": "raid_bdev1", 00:23:17.736 "uuid": "92c9ddd7-6b43-439b-9e72-0c8857149f5b", 00:23:17.736 "strip_size_kb": 0, 00:23:17.736 "state": "online", 00:23:17.736 "raid_level": "raid1", 00:23:17.736 "superblock": false, 00:23:17.736 "num_base_bdevs": 2, 00:23:17.736 "num_base_bdevs_discovered": 2, 00:23:17.736 "num_base_bdevs_operational": 2, 00:23:17.736 "base_bdevs_list": [ 00:23:17.736 { 00:23:17.736 "name": "spare", 00:23:17.736 "uuid": "ad075708-25c0-594b-862f-cd40592a4f79", 00:23:17.736 "is_configured": true, 00:23:17.736 "data_offset": 0, 00:23:17.736 "data_size": 65536 00:23:17.736 }, 00:23:17.736 { 00:23:17.736 "name": "BaseBdev2", 00:23:17.736 "uuid": "adbb97db-b8a4-5b99-ae62-8ca6d376f4fa", 00:23:17.736 "is_configured": true, 00:23:17.736 "data_offset": 0, 00:23:17.736 "data_size": 65536 00:23:17.736 } 00:23:17.736 ] 00:23:17.736 }' 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:17.736 13:49:06 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.736 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.995 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.995 "name": "raid_bdev1", 00:23:17.995 "uuid": "92c9ddd7-6b43-439b-9e72-0c8857149f5b", 00:23:17.995 "strip_size_kb": 0, 00:23:17.995 "state": "online", 00:23:17.995 "raid_level": "raid1", 00:23:17.995 "superblock": false, 00:23:17.995 "num_base_bdevs": 2, 00:23:17.995 "num_base_bdevs_discovered": 2, 00:23:17.995 "num_base_bdevs_operational": 2, 00:23:17.995 "base_bdevs_list": [ 00:23:17.995 { 00:23:17.995 "name": "spare", 00:23:17.995 "uuid": "ad075708-25c0-594b-862f-cd40592a4f79", 00:23:17.995 "is_configured": true, 00:23:17.995 "data_offset": 0, 00:23:17.995 "data_size": 65536 00:23:17.995 }, 00:23:17.995 { 00:23:17.995 "name": "BaseBdev2", 00:23:17.995 "uuid": "adbb97db-b8a4-5b99-ae62-8ca6d376f4fa", 00:23:17.995 "is_configured": true, 00:23:17.995 "data_offset": 0, 00:23:17.995 "data_size": 65536 00:23:17.995 } 00:23:17.995 ] 00:23:17.995 }' 00:23:17.995 13:49:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.995 13:49:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:18.562 13:49:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:18.820 [2024-07-12 13:49:07.254715] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:18.820 [2024-07-12 13:49:07.254745] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:18.820 [2024-07-12 13:49:07.254805] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:18.820 [2024-07-12 13:49:07.254862] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:18.820 [2024-07-12 13:49:07.254874] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27e89b0 name raid_bdev1, state offline 00:23:18.820 13:49:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.820 13:49:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' 
'/dev/nbd0 /dev/nbd1' 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:19.078 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:19.337 /dev/nbd0 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:19.337 1+0 records in 00:23:19.337 1+0 records out 00:23:19.337 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239853 s, 17.1 MB/s 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:19.337 13:49:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:19.594 /dev/nbd1 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 
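For reference, the verification pattern running through this part of the log exposes BaseBdev1 and the rebuilt spare through the kernel NBD driver, waits for each device to appear in /proc/partitions, byte-compares the two block devices, and then tears the NBD mappings down again (the cmp -i 0 /dev/nbd0 /dev/nbd1 call and the nbd_stop_disk calls follow in the next chunk). Below is a minimal standalone sketch of the same pattern, assuming a running SPDK target listening on /var/tmp/spdk-raid.sock and two unused NBD nodes; the bdev names, socket and rpc.py paths are taken from the log above, while the wait loop is a simplification of the waitfornbd helper, not the helper itself:

    #!/usr/bin/env bash
    # Map two bdevs over NBD and check that they are byte-identical.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    "$rpc" -s "$sock" nbd_start_disk BaseBdev1 /dev/nbd0
    "$rpc" -s "$sock" nbd_start_disk spare /dev/nbd1
    for n in nbd0 nbd1; do
        # wait until the kernel has registered the NBD device
        until grep -q -w "$n" /proc/partitions; do sleep 0.1; done
    done
    cmp -i 0 /dev/nbd0 /dev/nbd1 && echo "rebuild verified: contents match"
    "$rpc" -s "$sock" nbd_stop_disk /dev/nbd0
    "$rpc" -s "$sock" nbd_stop_disk /dev/nbd1
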
00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:19.594 1+0 records in 00:23:19.594 1+0 records out 00:23:19.594 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000305214 s, 13.4 MB/s 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:19.594 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:19.595 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:19.595 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:19.595 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:19.595 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:19.852 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:20.110 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:20.110 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:20.110 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:20.110 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:20.110 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:20.110 13:49:08 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:20.110 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:20.110 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:20.110 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:20.110 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 538511 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 538511 ']' 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 538511 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 538511 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 538511' 00:23:20.368 killing process with pid 538511 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 538511 00:23:20.368 Received shutdown signal, test time was about 60.000000 seconds 00:23:20.368 00:23:20.368 Latency(us) 00:23:20.368 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:20.368 =================================================================================================================== 00:23:20.368 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:20.368 [2024-07-12 13:49:08.773777] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:20.368 13:49:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 538511 00:23:20.368 [2024-07-12 13:49:08.800121] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:23:20.626 00:23:20.626 real 0m23.259s 00:23:20.626 user 0m30.544s 00:23:20.626 sys 0m5.641s 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:20.626 ************************************ 00:23:20.626 END TEST raid_rebuild_test 00:23:20.626 ************************************ 00:23:20.626 13:49:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:20.626 13:49:09 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:23:20.626 13:49:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:20.626 13:49:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:20.626 13:49:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:20.626 ************************************ 00:23:20.626 START TEST raid_rebuild_test_sb 00:23:20.626 ************************************ 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # 
raid_pid=541740 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 541740 /var/tmp/spdk-raid.sock 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 541740 ']' 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:20.626 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:20.626 13:49:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:20.626 [2024-07-12 13:49:09.161531] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:23:20.626 [2024-07-12 13:49:09.161595] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid541740 ] 00:23:20.626 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:20.626 Zero copy mechanism will not be used. 
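At this point the log switches from raid_rebuild_test to its superblock variant: raid_rebuild_test_sb runs the same function with superblock=true, which appends -s to the create arguments so that bdev_raid_create writes an on-disk superblock to each member. The visible consequence later in this run is that the first 2048 blocks of each 65536-block (32 MiB, 512-byte blocks) base bdev are reserved: bdev_raid_get_bdevs reports "data_offset": 2048 and "data_size": 63488, where the non-superblock test above reported 0 and 65536, and the data fill accordingly writes count=63488 sectors instead of count=65536. A minimal sketch of the difference at the RPC level (option values match this log; the rpc.py path is shortened for readability):

    # without an on-disk superblock: the whole base bdev is data, data_offset = 0
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create \
        -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
    # with -s: a superblock region is reserved at the start of each member;
    # in this run data_offset = 2048 blocks and data_size = 63488 blocks
    scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s \
        -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
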
00:23:20.884 [2024-07-12 13:49:09.291211] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:20.884 [2024-07-12 13:49:09.394954] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:20.884 [2024-07-12 13:49:09.455266] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:20.884 [2024-07-12 13:49:09.455332] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:21.852 13:49:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:21.852 13:49:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:23:21.852 13:49:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:21.852 13:49:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:21.852 BaseBdev1_malloc 00:23:21.852 13:49:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:22.148 [2024-07-12 13:49:10.637335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:22.148 [2024-07-12 13:49:10.637385] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:22.148 [2024-07-12 13:49:10.637409] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x926680 00:23:22.148 [2024-07-12 13:49:10.637423] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:22.148 [2024-07-12 13:49:10.639191] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:22.148 [2024-07-12 13:49:10.639220] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:22.148 BaseBdev1 00:23:22.148 13:49:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:22.148 13:49:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:22.421 BaseBdev2_malloc 00:23:22.421 13:49:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:22.708 [2024-07-12 13:49:11.127521] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:22.708 [2024-07-12 13:49:11.127568] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:22.708 [2024-07-12 13:49:11.127591] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9271a0 00:23:22.708 [2024-07-12 13:49:11.127604] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:22.708 [2024-07-12 13:49:11.129189] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:22.708 [2024-07-12 13:49:11.129217] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:22.708 BaseBdev2 00:23:22.708 13:49:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
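The "spare" device assembled here and in the next chunk is not a bare malloc bdev but a three-layer stack, malloc -> delay -> passthru: the delay bdev injects an artificial write latency (100000, in microseconds per the bdev_delay_create interface, so roughly 100 ms per write), presumably so the rebuild runs slowly enough for the test to catch "process": "rebuild" progress snapshots, and the passthru bdev simply re-exports the stack under the name "spare" that the test expects. The same stack collected in one place, with the rpc.py path shortened and explanatory comments added:

    rpc="scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # 32 MiB backing store with 512-byte blocks (65536 blocks)
    $rpc bdev_malloc_create 32 512 -b spare_malloc
    # zero read latency, large write latency so the rebuild does not finish instantly
    $rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    # expose the top of the stack under the name "spare"
    $rpc bdev_passthru_create -b spare_delay -p spare
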
00:23:22.972 spare_malloc 00:23:22.972 13:49:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:23.231 spare_delay 00:23:23.231 13:49:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:23.490 [2024-07-12 13:49:11.866021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:23.490 [2024-07-12 13:49:11.866067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:23.490 [2024-07-12 13:49:11.866088] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad5800 00:23:23.490 [2024-07-12 13:49:11.866106] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:23.490 [2024-07-12 13:49:11.867727] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:23.490 [2024-07-12 13:49:11.867755] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:23.490 spare 00:23:23.490 13:49:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:23.749 [2024-07-12 13:49:12.098656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:23.749 [2024-07-12 13:49:12.100002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:23.749 [2024-07-12 13:49:12.100168] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xad69b0 00:23:23.749 [2024-07-12 13:49:12.100181] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:23.749 [2024-07-12 13:49:12.100379] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xacfdd0 00:23:23.749 [2024-07-12 13:49:12.100519] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xad69b0 00:23:23.749 [2024-07-12 13:49:12.100530] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xad69b0 00:23:23.749 [2024-07-12 13:49:12.100628] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:23.749 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:23.749 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:23.749 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:23.749 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:23.749 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:23.749 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:23.749 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:23.749 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:23.749 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:23.749 13:49:12 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:23.749 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.749 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:24.008 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:24.008 "name": "raid_bdev1", 00:23:24.008 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:24.008 "strip_size_kb": 0, 00:23:24.008 "state": "online", 00:23:24.008 "raid_level": "raid1", 00:23:24.008 "superblock": true, 00:23:24.008 "num_base_bdevs": 2, 00:23:24.008 "num_base_bdevs_discovered": 2, 00:23:24.008 "num_base_bdevs_operational": 2, 00:23:24.008 "base_bdevs_list": [ 00:23:24.008 { 00:23:24.008 "name": "BaseBdev1", 00:23:24.008 "uuid": "9ea42ba0-8878-536e-aebc-9f8d4d09b200", 00:23:24.008 "is_configured": true, 00:23:24.008 "data_offset": 2048, 00:23:24.008 "data_size": 63488 00:23:24.008 }, 00:23:24.008 { 00:23:24.008 "name": "BaseBdev2", 00:23:24.008 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:24.008 "is_configured": true, 00:23:24.008 "data_offset": 2048, 00:23:24.008 "data_size": 63488 00:23:24.008 } 00:23:24.008 ] 00:23:24.008 }' 00:23:24.008 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:24.008 13:49:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:24.576 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:24.576 13:49:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:24.836 [2024-07-12 13:49:13.193774] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:24.836 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:24.836 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.836 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:25.094 13:49:13 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:25.094 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:25.353 [2024-07-12 13:49:13.686867] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xacfdd0 00:23:25.353 /dev/nbd0 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:25.353 1+0 records in 00:23:25.353 1+0 records out 00:23:25.353 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254736 s, 16.1 MB/s 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:25.353 13:49:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:23:31.916 63488+0 records in 00:23:31.916 63488+0 records out 00:23:31.916 32505856 bytes (33 MB, 31 MiB) copied, 5.74191 s, 5.7 MB/s 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:31.916 [2024-07-12 13:49:19.762720] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:31.916 13:49:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:31.916 [2024-07-12 13:49:19.998983] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.916 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:31.917 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:31.917 "name": "raid_bdev1", 00:23:31.917 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:31.917 "strip_size_kb": 0, 00:23:31.917 "state": "online", 
00:23:31.917 "raid_level": "raid1", 00:23:31.917 "superblock": true, 00:23:31.917 "num_base_bdevs": 2, 00:23:31.917 "num_base_bdevs_discovered": 1, 00:23:31.917 "num_base_bdevs_operational": 1, 00:23:31.917 "base_bdevs_list": [ 00:23:31.917 { 00:23:31.917 "name": null, 00:23:31.917 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:31.917 "is_configured": false, 00:23:31.917 "data_offset": 2048, 00:23:31.917 "data_size": 63488 00:23:31.917 }, 00:23:31.917 { 00:23:31.917 "name": "BaseBdev2", 00:23:31.917 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:31.917 "is_configured": true, 00:23:31.917 "data_offset": 2048, 00:23:31.917 "data_size": 63488 00:23:31.917 } 00:23:31.917 ] 00:23:31.917 }' 00:23:31.917 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:31.917 13:49:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:32.482 13:49:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:32.740 [2024-07-12 13:49:21.097902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:32.740 [2024-07-12 13:49:21.102836] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xad6620 00:23:32.740 [2024-07-12 13:49:21.105059] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:32.740 13:49:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:33.674 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:33.674 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:33.674 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:33.674 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:33.674 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:33.674 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.674 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.932 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:33.932 "name": "raid_bdev1", 00:23:33.932 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:33.932 "strip_size_kb": 0, 00:23:33.932 "state": "online", 00:23:33.932 "raid_level": "raid1", 00:23:33.932 "superblock": true, 00:23:33.932 "num_base_bdevs": 2, 00:23:33.932 "num_base_bdevs_discovered": 2, 00:23:33.932 "num_base_bdevs_operational": 2, 00:23:33.932 "process": { 00:23:33.932 "type": "rebuild", 00:23:33.932 "target": "spare", 00:23:33.932 "progress": { 00:23:33.932 "blocks": 24576, 00:23:33.932 "percent": 38 00:23:33.932 } 00:23:33.932 }, 00:23:33.932 "base_bdevs_list": [ 00:23:33.932 { 00:23:33.932 "name": "spare", 00:23:33.932 "uuid": "4fb8519d-bd22-5955-84cb-2f9ed6ccd240", 00:23:33.932 "is_configured": true, 00:23:33.932 "data_offset": 2048, 00:23:33.932 "data_size": 63488 00:23:33.932 }, 00:23:33.932 { 00:23:33.932 "name": "BaseBdev2", 00:23:33.932 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:33.932 "is_configured": true, 00:23:33.932 
"data_offset": 2048, 00:23:33.932 "data_size": 63488 00:23:33.932 } 00:23:33.932 ] 00:23:33.932 }' 00:23:33.932 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:33.932 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:33.932 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:33.932 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:33.932 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:34.190 [2024-07-12 13:49:22.683724] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:34.190 [2024-07-12 13:49:22.717670] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:34.190 [2024-07-12 13:49:22.717715] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:34.190 [2024-07-12 13:49:22.717731] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:34.190 [2024-07-12 13:49:22.717739] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.190 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.448 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:34.448 "name": "raid_bdev1", 00:23:34.448 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:34.448 "strip_size_kb": 0, 00:23:34.448 "state": "online", 00:23:34.448 "raid_level": "raid1", 00:23:34.448 "superblock": true, 00:23:34.448 "num_base_bdevs": 2, 00:23:34.448 "num_base_bdevs_discovered": 1, 00:23:34.448 "num_base_bdevs_operational": 1, 00:23:34.448 "base_bdevs_list": [ 00:23:34.448 { 00:23:34.448 "name": null, 00:23:34.448 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.448 "is_configured": false, 00:23:34.448 "data_offset": 2048, 00:23:34.448 "data_size": 63488 00:23:34.448 }, 00:23:34.448 { 
00:23:34.448 "name": "BaseBdev2", 00:23:34.448 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:34.448 "is_configured": true, 00:23:34.448 "data_offset": 2048, 00:23:34.448 "data_size": 63488 00:23:34.448 } 00:23:34.448 ] 00:23:34.448 }' 00:23:34.448 13:49:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:34.448 13:49:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:35.013 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:35.013 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:35.013 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:35.013 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:35.013 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:35.013 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.013 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.271 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:35.271 "name": "raid_bdev1", 00:23:35.271 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:35.271 "strip_size_kb": 0, 00:23:35.271 "state": "online", 00:23:35.271 "raid_level": "raid1", 00:23:35.271 "superblock": true, 00:23:35.271 "num_base_bdevs": 2, 00:23:35.271 "num_base_bdevs_discovered": 1, 00:23:35.271 "num_base_bdevs_operational": 1, 00:23:35.271 "base_bdevs_list": [ 00:23:35.271 { 00:23:35.271 "name": null, 00:23:35.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.271 "is_configured": false, 00:23:35.271 "data_offset": 2048, 00:23:35.271 "data_size": 63488 00:23:35.271 }, 00:23:35.271 { 00:23:35.271 "name": "BaseBdev2", 00:23:35.271 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:35.271 "is_configured": true, 00:23:35.271 "data_offset": 2048, 00:23:35.271 "data_size": 63488 00:23:35.271 } 00:23:35.271 ] 00:23:35.271 }' 00:23:35.271 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:35.529 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:35.529 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:35.529 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:35.529 13:49:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:35.788 [2024-07-12 13:49:24.121729] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:35.788 [2024-07-12 13:49:24.127366] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xad6620 00:23:35.788 [2024-07-12 13:49:24.128861] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:35.788 13:49:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:36.723 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:23:36.723 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:36.723 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:36.723 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:36.723 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:36.723 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.723 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:36.982 "name": "raid_bdev1", 00:23:36.982 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:36.982 "strip_size_kb": 0, 00:23:36.982 "state": "online", 00:23:36.982 "raid_level": "raid1", 00:23:36.982 "superblock": true, 00:23:36.982 "num_base_bdevs": 2, 00:23:36.982 "num_base_bdevs_discovered": 2, 00:23:36.982 "num_base_bdevs_operational": 2, 00:23:36.982 "process": { 00:23:36.982 "type": "rebuild", 00:23:36.982 "target": "spare", 00:23:36.982 "progress": { 00:23:36.982 "blocks": 24576, 00:23:36.982 "percent": 38 00:23:36.982 } 00:23:36.982 }, 00:23:36.982 "base_bdevs_list": [ 00:23:36.982 { 00:23:36.982 "name": "spare", 00:23:36.982 "uuid": "4fb8519d-bd22-5955-84cb-2f9ed6ccd240", 00:23:36.982 "is_configured": true, 00:23:36.982 "data_offset": 2048, 00:23:36.982 "data_size": 63488 00:23:36.982 }, 00:23:36.982 { 00:23:36.982 "name": "BaseBdev2", 00:23:36.982 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:36.982 "is_configured": true, 00:23:36.982 "data_offset": 2048, 00:23:36.982 "data_size": 63488 00:23:36.982 } 00:23:36.982 ] 00:23:36.982 }' 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:36.982 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=829 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.982 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.240 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:37.240 "name": "raid_bdev1", 00:23:37.240 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:37.240 "strip_size_kb": 0, 00:23:37.240 "state": "online", 00:23:37.240 "raid_level": "raid1", 00:23:37.240 "superblock": true, 00:23:37.240 "num_base_bdevs": 2, 00:23:37.240 "num_base_bdevs_discovered": 2, 00:23:37.240 "num_base_bdevs_operational": 2, 00:23:37.240 "process": { 00:23:37.240 "type": "rebuild", 00:23:37.240 "target": "spare", 00:23:37.240 "progress": { 00:23:37.240 "blocks": 30720, 00:23:37.240 "percent": 48 00:23:37.240 } 00:23:37.240 }, 00:23:37.240 "base_bdevs_list": [ 00:23:37.240 { 00:23:37.240 "name": "spare", 00:23:37.240 "uuid": "4fb8519d-bd22-5955-84cb-2f9ed6ccd240", 00:23:37.240 "is_configured": true, 00:23:37.240 "data_offset": 2048, 00:23:37.240 "data_size": 63488 00:23:37.240 }, 00:23:37.240 { 00:23:37.240 "name": "BaseBdev2", 00:23:37.240 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:37.240 "is_configured": true, 00:23:37.240 "data_offset": 2048, 00:23:37.240 "data_size": 63488 00:23:37.240 } 00:23:37.240 ] 00:23:37.240 }' 00:23:37.240 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:37.240 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:37.240 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:37.498 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:37.498 13:49:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:38.434 13:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:38.434 13:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:38.434 13:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:38.434 13:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:38.434 13:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:38.434 13:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:38.434 13:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.434 13:49:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.694 13:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:38.694 "name": "raid_bdev1", 00:23:38.694 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:38.694 "strip_size_kb": 0, 00:23:38.694 "state": 
"online", 00:23:38.694 "raid_level": "raid1", 00:23:38.694 "superblock": true, 00:23:38.694 "num_base_bdevs": 2, 00:23:38.694 "num_base_bdevs_discovered": 2, 00:23:38.694 "num_base_bdevs_operational": 2, 00:23:38.694 "process": { 00:23:38.694 "type": "rebuild", 00:23:38.694 "target": "spare", 00:23:38.694 "progress": { 00:23:38.694 "blocks": 59392, 00:23:38.694 "percent": 93 00:23:38.694 } 00:23:38.694 }, 00:23:38.694 "base_bdevs_list": [ 00:23:38.694 { 00:23:38.694 "name": "spare", 00:23:38.694 "uuid": "4fb8519d-bd22-5955-84cb-2f9ed6ccd240", 00:23:38.694 "is_configured": true, 00:23:38.694 "data_offset": 2048, 00:23:38.694 "data_size": 63488 00:23:38.694 }, 00:23:38.694 { 00:23:38.694 "name": "BaseBdev2", 00:23:38.694 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:38.694 "is_configured": true, 00:23:38.694 "data_offset": 2048, 00:23:38.694 "data_size": 63488 00:23:38.694 } 00:23:38.694 ] 00:23:38.694 }' 00:23:38.694 13:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:38.695 13:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:38.695 13:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:38.695 13:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:38.695 13:49:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:38.695 [2024-07-12 13:49:27.252695] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:38.695 [2024-07-12 13:49:27.252751] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:38.695 [2024-07-12 13:49:27.252828] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:39.630 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:39.630 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:39.630 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:39.630 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:39.630 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:39.630 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:39.630 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.630 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.889 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:39.889 "name": "raid_bdev1", 00:23:39.889 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:39.889 "strip_size_kb": 0, 00:23:39.889 "state": "online", 00:23:39.889 "raid_level": "raid1", 00:23:39.889 "superblock": true, 00:23:39.889 "num_base_bdevs": 2, 00:23:39.889 "num_base_bdevs_discovered": 2, 00:23:39.889 "num_base_bdevs_operational": 2, 00:23:39.889 "base_bdevs_list": [ 00:23:39.889 { 00:23:39.889 "name": "spare", 00:23:39.889 "uuid": "4fb8519d-bd22-5955-84cb-2f9ed6ccd240", 00:23:39.889 "is_configured": true, 00:23:39.889 "data_offset": 2048, 00:23:39.889 "data_size": 63488 
00:23:39.889 }, 00:23:39.889 { 00:23:39.889 "name": "BaseBdev2", 00:23:39.889 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:39.889 "is_configured": true, 00:23:39.889 "data_offset": 2048, 00:23:39.889 "data_size": 63488 00:23:39.889 } 00:23:39.889 ] 00:23:39.889 }' 00:23:39.889 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:40.147 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:40.147 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:40.147 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:40.147 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:23:40.147 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:40.147 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:40.147 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:40.147 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:40.147 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:40.147 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.147 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:40.406 "name": "raid_bdev1", 00:23:40.406 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:40.406 "strip_size_kb": 0, 00:23:40.406 "state": "online", 00:23:40.406 "raid_level": "raid1", 00:23:40.406 "superblock": true, 00:23:40.406 "num_base_bdevs": 2, 00:23:40.406 "num_base_bdevs_discovered": 2, 00:23:40.406 "num_base_bdevs_operational": 2, 00:23:40.406 "base_bdevs_list": [ 00:23:40.406 { 00:23:40.406 "name": "spare", 00:23:40.406 "uuid": "4fb8519d-bd22-5955-84cb-2f9ed6ccd240", 00:23:40.406 "is_configured": true, 00:23:40.406 "data_offset": 2048, 00:23:40.406 "data_size": 63488 00:23:40.406 }, 00:23:40.406 { 00:23:40.406 "name": "BaseBdev2", 00:23:40.406 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:40.406 "is_configured": true, 00:23:40.406 "data_offset": 2048, 00:23:40.406 "data_size": 63488 00:23:40.406 } 00:23:40.406 ] 00:23:40.406 }' 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.406 13:49:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.665 13:49:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:40.665 "name": "raid_bdev1", 00:23:40.665 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:40.665 "strip_size_kb": 0, 00:23:40.665 "state": "online", 00:23:40.665 "raid_level": "raid1", 00:23:40.665 "superblock": true, 00:23:40.665 "num_base_bdevs": 2, 00:23:40.665 "num_base_bdevs_discovered": 2, 00:23:40.665 "num_base_bdevs_operational": 2, 00:23:40.665 "base_bdevs_list": [ 00:23:40.665 { 00:23:40.665 "name": "spare", 00:23:40.665 "uuid": "4fb8519d-bd22-5955-84cb-2f9ed6ccd240", 00:23:40.665 "is_configured": true, 00:23:40.665 "data_offset": 2048, 00:23:40.665 "data_size": 63488 00:23:40.665 }, 00:23:40.665 { 00:23:40.665 "name": "BaseBdev2", 00:23:40.665 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:40.665 "is_configured": true, 00:23:40.665 "data_offset": 2048, 00:23:40.665 "data_size": 63488 00:23:40.665 } 00:23:40.665 ] 00:23:40.665 }' 00:23:40.665 13:49:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:40.665 13:49:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:41.232 13:49:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:41.491 [2024-07-12 13:49:29.864245] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:41.491 [2024-07-12 13:49:29.864280] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:41.491 [2024-07-12 13:49:29.864338] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:41.491 [2024-07-12 13:49:29.864392] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:41.491 [2024-07-12 13:49:29.864404] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xad69b0 name raid_bdev1, state offline 00:23:41.491 13:49:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:41.491 13:49:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:41.750 13:49:30 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:41.750 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:42.009 /dev/nbd0 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:42.009 1+0 records in 00:23:42.009 1+0 records out 00:23:42.009 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289411 s, 14.2 MB/s 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:42.009 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:42.009 13:49:30 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:42.268 /dev/nbd1 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:42.268 1+0 records in 00:23:42.268 1+0 records out 00:23:42.268 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294811 s, 13.9 MB/s 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:42.268 13:49:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:42.527 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:42.527 13:49:31 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:42.527 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:42.527 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:42.527 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:42.527 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:42.527 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:42.527 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:42.527 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:42.527 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:43.103 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:43.103 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:43.103 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:43.103 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:43.103 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:43.103 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:43.103 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:43.103 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:43.103 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:43.103 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:43.103 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:43.362 [2024-07-12 13:49:31.805545] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:43.362 [2024-07-12 13:49:31.805588] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:43.362 [2024-07-12 13:49:31.805609] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad5e40 00:23:43.362 [2024-07-12 13:49:31.805622] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:43.362 [2024-07-12 13:49:31.807260] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:43.362 [2024-07-12 13:49:31.807288] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:43.362 [2024-07-12 13:49:31.807368] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:43.362 [2024-07-12 13:49:31.807393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:43.362 [2024-07-12 13:49:31.807491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:43.362 spare 00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
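[editorial sketch, not part of the captured log] The nbd section above exports BaseBdev1 and the spare, waits for each device node, and compares their payloads. The following is a condensed, hedged reconstruction using only commands seen in this trace; the retry loop approximates waitfornbd, and the scratch file path is arbitrary. The 1048576-byte skip passed to cmp appears to correspond to the superblock's data_offset of 2048 blocks at 512 bytes per block (2048 * 512 = 1048576).

# hedged sketch of the nbd export + payload comparison step
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

"$rpc" -s "$sock" nbd_start_disk BaseBdev1 /dev/nbd0
"$rpc" -s "$sock" nbd_start_disk spare     /dev/nbd1

for nbd in nbd0 nbd1; do
    # waitfornbd-style readiness check: up to 20 attempts in the trace
    for i in $(seq 1 20); do
        grep -q -w "$nbd" /proc/partitions && break
        sleep 0.1   # assumed delay; the helper's actual pacing is not shown
    done
    # a single direct-I/O read proves the device answers
    dd if=/dev/$nbd of=/tmp/nbdtest bs=4096 count=1 iflag=direct
done

# payload beyond the 1 MiB metadata region must be identical
cmp -i 1048576 /dev/nbd0 /dev/nbd1

"$rpc" -s "$sock" nbd_stop_disk /dev/nbd0
"$rpc" -s "$sock" nbd_stop_disk /dev/nbd1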
00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.362 13:49:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.362 [2024-07-12 13:49:31.907802] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xad4ba0 00:23:43.362 [2024-07-12 13:49:31.907820] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:43.362 [2024-07-12 13:49:31.908023] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xacfdd0 00:23:43.362 [2024-07-12 13:49:31.908165] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xad4ba0 00:23:43.362 [2024-07-12 13:49:31.908176] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xad4ba0 00:23:43.362 [2024-07-12 13:49:31.908275] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:43.620 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:43.620 "name": "raid_bdev1", 00:23:43.620 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:43.620 "strip_size_kb": 0, 00:23:43.620 "state": "online", 00:23:43.620 "raid_level": "raid1", 00:23:43.620 "superblock": true, 00:23:43.620 "num_base_bdevs": 2, 00:23:43.620 "num_base_bdevs_discovered": 2, 00:23:43.620 "num_base_bdevs_operational": 2, 00:23:43.620 "base_bdevs_list": [ 00:23:43.620 { 00:23:43.620 "name": "spare", 00:23:43.621 "uuid": "4fb8519d-bd22-5955-84cb-2f9ed6ccd240", 00:23:43.621 "is_configured": true, 00:23:43.621 "data_offset": 2048, 00:23:43.621 "data_size": 63488 00:23:43.621 }, 00:23:43.621 { 00:23:43.621 "name": "BaseBdev2", 00:23:43.621 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:43.621 "is_configured": true, 00:23:43.621 "data_offset": 2048, 00:23:43.621 "data_size": 63488 00:23:43.621 } 00:23:43.621 ] 00:23:43.621 }' 00:23:43.621 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:43.621 13:49:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:44.187 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:44.187 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:44.187 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local 
process_type=none 00:23:44.187 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:44.188 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:44.188 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.188 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:44.446 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:44.446 "name": "raid_bdev1", 00:23:44.446 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:44.446 "strip_size_kb": 0, 00:23:44.446 "state": "online", 00:23:44.446 "raid_level": "raid1", 00:23:44.446 "superblock": true, 00:23:44.446 "num_base_bdevs": 2, 00:23:44.446 "num_base_bdevs_discovered": 2, 00:23:44.446 "num_base_bdevs_operational": 2, 00:23:44.446 "base_bdevs_list": [ 00:23:44.446 { 00:23:44.446 "name": "spare", 00:23:44.446 "uuid": "4fb8519d-bd22-5955-84cb-2f9ed6ccd240", 00:23:44.446 "is_configured": true, 00:23:44.446 "data_offset": 2048, 00:23:44.446 "data_size": 63488 00:23:44.446 }, 00:23:44.446 { 00:23:44.446 "name": "BaseBdev2", 00:23:44.446 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:44.446 "is_configured": true, 00:23:44.446 "data_offset": 2048, 00:23:44.446 "data_size": 63488 00:23:44.446 } 00:23:44.446 ] 00:23:44.446 }' 00:23:44.446 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:44.446 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:44.446 13:49:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:44.446 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:44.446 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.446 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:44.704 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:44.704 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:44.963 [2024-07-12 13:49:33.498178] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.963 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.221 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:45.221 "name": "raid_bdev1", 00:23:45.221 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:45.221 "strip_size_kb": 0, 00:23:45.221 "state": "online", 00:23:45.221 "raid_level": "raid1", 00:23:45.221 "superblock": true, 00:23:45.221 "num_base_bdevs": 2, 00:23:45.221 "num_base_bdevs_discovered": 1, 00:23:45.221 "num_base_bdevs_operational": 1, 00:23:45.221 "base_bdevs_list": [ 00:23:45.221 { 00:23:45.221 "name": null, 00:23:45.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:45.221 "is_configured": false, 00:23:45.221 "data_offset": 2048, 00:23:45.221 "data_size": 63488 00:23:45.221 }, 00:23:45.221 { 00:23:45.221 "name": "BaseBdev2", 00:23:45.221 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:45.221 "is_configured": true, 00:23:45.221 "data_offset": 2048, 00:23:45.221 "data_size": 63488 00:23:45.221 } 00:23:45.221 ] 00:23:45.221 }' 00:23:45.221 13:49:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:45.221 13:49:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:45.788 13:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:46.046 [2024-07-12 13:49:34.577049] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:46.046 [2024-07-12 13:49:34.577194] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:46.046 [2024-07-12 13:49:34.577210] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:46.046 [2024-07-12 13:49:34.577238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:46.046 [2024-07-12 13:49:34.582035] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xacfdd0 00:23:46.046 [2024-07-12 13:49:34.584352] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:46.046 13:49:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:47.423 "name": "raid_bdev1", 00:23:47.423 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:47.423 "strip_size_kb": 0, 00:23:47.423 "state": "online", 00:23:47.423 "raid_level": "raid1", 00:23:47.423 "superblock": true, 00:23:47.423 "num_base_bdevs": 2, 00:23:47.423 "num_base_bdevs_discovered": 2, 00:23:47.423 "num_base_bdevs_operational": 2, 00:23:47.423 "process": { 00:23:47.423 "type": "rebuild", 00:23:47.423 "target": "spare", 00:23:47.423 "progress": { 00:23:47.423 "blocks": 24576, 00:23:47.423 "percent": 38 00:23:47.423 } 00:23:47.423 }, 00:23:47.423 "base_bdevs_list": [ 00:23:47.423 { 00:23:47.423 "name": "spare", 00:23:47.423 "uuid": "4fb8519d-bd22-5955-84cb-2f9ed6ccd240", 00:23:47.423 "is_configured": true, 00:23:47.423 "data_offset": 2048, 00:23:47.423 "data_size": 63488 00:23:47.423 }, 00:23:47.423 { 00:23:47.423 "name": "BaseBdev2", 00:23:47.423 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:47.423 "is_configured": true, 00:23:47.423 "data_offset": 2048, 00:23:47.423 "data_size": 63488 00:23:47.423 } 00:23:47.423 ] 00:23:47.423 }' 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:47.423 13:49:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:47.682 [2024-07-12 13:49:36.162631] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:47.682 [2024-07-12 13:49:36.196444] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:47.682 [2024-07-12 13:49:36.196486] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:23:47.682 [2024-07-12 13:49:36.196502] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:47.682 [2024-07-12 13:49:36.196510] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.682 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.941 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:47.941 "name": "raid_bdev1", 00:23:47.941 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:47.942 "strip_size_kb": 0, 00:23:47.942 "state": "online", 00:23:47.942 "raid_level": "raid1", 00:23:47.942 "superblock": true, 00:23:47.942 "num_base_bdevs": 2, 00:23:47.942 "num_base_bdevs_discovered": 1, 00:23:47.942 "num_base_bdevs_operational": 1, 00:23:47.942 "base_bdevs_list": [ 00:23:47.942 { 00:23:47.942 "name": null, 00:23:47.942 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:47.942 "is_configured": false, 00:23:47.942 "data_offset": 2048, 00:23:47.942 "data_size": 63488 00:23:47.942 }, 00:23:47.942 { 00:23:47.942 "name": "BaseBdev2", 00:23:47.942 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:47.942 "is_configured": true, 00:23:47.942 "data_offset": 2048, 00:23:47.942 "data_size": 63488 00:23:47.942 } 00:23:47.942 ] 00:23:47.942 }' 00:23:47.942 13:49:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:47.942 13:49:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:48.509 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:48.769 [2024-07-12 13:49:37.243623] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:48.769 [2024-07-12 13:49:37.243673] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:48.769 [2024-07-12 13:49:37.243695] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad6070 00:23:48.769 [2024-07-12 13:49:37.243708] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:23:48.769 [2024-07-12 13:49:37.244085] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:48.769 [2024-07-12 13:49:37.244103] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:48.769 [2024-07-12 13:49:37.244180] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:48.769 [2024-07-12 13:49:37.244193] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:48.769 [2024-07-12 13:49:37.244204] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:48.769 [2024-07-12 13:49:37.244222] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:48.769 [2024-07-12 13:49:37.249112] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xad73e0 00:23:48.769 [2024-07-12 13:49:37.250564] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:48.769 spare 00:23:48.769 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:49.707 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:49.707 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:49.707 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:49.707 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:49.707 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:49.707 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.707 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.965 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:49.965 "name": "raid_bdev1", 00:23:49.965 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:49.965 "strip_size_kb": 0, 00:23:49.965 "state": "online", 00:23:49.965 "raid_level": "raid1", 00:23:49.965 "superblock": true, 00:23:49.965 "num_base_bdevs": 2, 00:23:49.965 "num_base_bdevs_discovered": 2, 00:23:49.965 "num_base_bdevs_operational": 2, 00:23:49.965 "process": { 00:23:49.965 "type": "rebuild", 00:23:49.965 "target": "spare", 00:23:49.965 "progress": { 00:23:49.965 "blocks": 22528, 00:23:49.965 "percent": 35 00:23:49.965 } 00:23:49.965 }, 00:23:49.965 "base_bdevs_list": [ 00:23:49.965 { 00:23:49.965 "name": "spare", 00:23:49.965 "uuid": "4fb8519d-bd22-5955-84cb-2f9ed6ccd240", 00:23:49.965 "is_configured": true, 00:23:49.965 "data_offset": 2048, 00:23:49.965 "data_size": 63488 00:23:49.965 }, 00:23:49.965 { 00:23:49.965 "name": "BaseBdev2", 00:23:49.965 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:49.965 "is_configured": true, 00:23:49.965 "data_offset": 2048, 00:23:49.965 "data_size": 63488 00:23:49.965 } 00:23:49.965 ] 00:23:49.965 }' 00:23:49.965 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:49.965 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:49.965 13:49:38 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:50.225 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:50.225 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:50.225 [2024-07-12 13:49:38.773556] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:50.484 [2024-07-12 13:49:38.863189] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:50.484 [2024-07-12 13:49:38.863234] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:50.484 [2024-07-12 13:49:38.863249] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:50.484 [2024-07-12 13:49:38.863258] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.484 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.743 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:50.743 "name": "raid_bdev1", 00:23:50.743 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:50.743 "strip_size_kb": 0, 00:23:50.743 "state": "online", 00:23:50.743 "raid_level": "raid1", 00:23:50.743 "superblock": true, 00:23:50.743 "num_base_bdevs": 2, 00:23:50.743 "num_base_bdevs_discovered": 1, 00:23:50.743 "num_base_bdevs_operational": 1, 00:23:50.743 "base_bdevs_list": [ 00:23:50.743 { 00:23:50.743 "name": null, 00:23:50.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:50.743 "is_configured": false, 00:23:50.743 "data_offset": 2048, 00:23:50.743 "data_size": 63488 00:23:50.743 }, 00:23:50.743 { 00:23:50.743 "name": "BaseBdev2", 00:23:50.743 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:50.743 "is_configured": true, 00:23:50.743 "data_offset": 2048, 00:23:50.743 "data_size": 63488 00:23:50.743 } 00:23:50.743 ] 00:23:50.743 }' 00:23:50.743 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:50.743 13:49:39 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:23:51.311 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:51.311 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:51.311 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:51.311 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:51.311 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:51.311 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.311 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.571 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:51.571 "name": "raid_bdev1", 00:23:51.571 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:51.571 "strip_size_kb": 0, 00:23:51.571 "state": "online", 00:23:51.571 "raid_level": "raid1", 00:23:51.571 "superblock": true, 00:23:51.571 "num_base_bdevs": 2, 00:23:51.571 "num_base_bdevs_discovered": 1, 00:23:51.571 "num_base_bdevs_operational": 1, 00:23:51.571 "base_bdevs_list": [ 00:23:51.571 { 00:23:51.571 "name": null, 00:23:51.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:51.571 "is_configured": false, 00:23:51.571 "data_offset": 2048, 00:23:51.571 "data_size": 63488 00:23:51.571 }, 00:23:51.571 { 00:23:51.571 "name": "BaseBdev2", 00:23:51.571 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:51.571 "is_configured": true, 00:23:51.571 "data_offset": 2048, 00:23:51.571 "data_size": 63488 00:23:51.571 } 00:23:51.571 ] 00:23:51.571 }' 00:23:51.571 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:51.571 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:51.571 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:51.571 13:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:51.571 13:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:51.831 13:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:52.089 [2024-07-12 13:49:40.495951] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:52.089 [2024-07-12 13:49:40.496000] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:52.089 [2024-07-12 13:49:40.496022] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad0f90 00:23:52.089 [2024-07-12 13:49:40.496034] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:52.090 [2024-07-12 13:49:40.496387] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:52.090 [2024-07-12 13:49:40.496405] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:52.090 [2024-07-12 13:49:40.496467] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:52.090 [2024-07-12 13:49:40.496479] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:52.090 [2024-07-12 13:49:40.496489] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:52.090 BaseBdev1 00:23:52.090 13:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.027 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.286 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:53.286 "name": "raid_bdev1", 00:23:53.286 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:53.286 "strip_size_kb": 0, 00:23:53.286 "state": "online", 00:23:53.286 "raid_level": "raid1", 00:23:53.286 "superblock": true, 00:23:53.286 "num_base_bdevs": 2, 00:23:53.286 "num_base_bdevs_discovered": 1, 00:23:53.286 "num_base_bdevs_operational": 1, 00:23:53.286 "base_bdevs_list": [ 00:23:53.286 { 00:23:53.286 "name": null, 00:23:53.286 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:53.286 "is_configured": false, 00:23:53.286 "data_offset": 2048, 00:23:53.286 "data_size": 63488 00:23:53.286 }, 00:23:53.286 { 00:23:53.286 "name": "BaseBdev2", 00:23:53.286 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:53.286 "is_configured": true, 00:23:53.286 "data_offset": 2048, 00:23:53.286 "data_size": 63488 00:23:53.286 } 00:23:53.286 ] 00:23:53.286 }' 00:23:53.286 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:53.286 13:49:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:53.855 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:53.855 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:53.855 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:53.855 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # 
local target=none 00:23:53.855 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:53.855 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.855 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.113 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:54.113 "name": "raid_bdev1", 00:23:54.113 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:54.113 "strip_size_kb": 0, 00:23:54.113 "state": "online", 00:23:54.113 "raid_level": "raid1", 00:23:54.113 "superblock": true, 00:23:54.113 "num_base_bdevs": 2, 00:23:54.113 "num_base_bdevs_discovered": 1, 00:23:54.113 "num_base_bdevs_operational": 1, 00:23:54.113 "base_bdevs_list": [ 00:23:54.113 { 00:23:54.113 "name": null, 00:23:54.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:54.113 "is_configured": false, 00:23:54.113 "data_offset": 2048, 00:23:54.113 "data_size": 63488 00:23:54.113 }, 00:23:54.113 { 00:23:54.113 "name": "BaseBdev2", 00:23:54.113 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:54.113 "is_configured": true, 00:23:54.113 "data_offset": 2048, 00:23:54.113 "data_size": 63488 00:23:54.113 } 00:23:54.113 ] 00:23:54.113 }' 00:23:54.113 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:54.371 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:54.630 [2024-07-12 13:49:42.974551] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:54.630 [2024-07-12 13:49:42.974678] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:54.630 [2024-07-12 13:49:42.974694] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:54.630 request: 00:23:54.630 { 00:23:54.630 "base_bdev": "BaseBdev1", 00:23:54.630 "raid_bdev": "raid_bdev1", 00:23:54.630 "method": "bdev_raid_add_base_bdev", 00:23:54.630 "req_id": 1 00:23:54.630 } 00:23:54.630 Got JSON-RPC error response 00:23:54.630 response: 00:23:54.630 { 00:23:54.630 "code": -22, 00:23:54.630 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:54.630 } 00:23:54.630 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:23:54.630 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:54.630 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:54.630 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:54.630 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:55.568 13:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:55.568 13:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:55.568 13:49:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:55.568 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:55.568 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:55.568 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:55.568 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:55.568 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:55.568 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:55.568 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:55.568 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.568 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.827 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:55.827 "name": "raid_bdev1", 00:23:55.827 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:55.827 "strip_size_kb": 0, 00:23:55.827 "state": "online", 00:23:55.827 "raid_level": "raid1", 00:23:55.827 "superblock": true, 00:23:55.827 "num_base_bdevs": 2, 00:23:55.827 "num_base_bdevs_discovered": 1, 00:23:55.827 "num_base_bdevs_operational": 1, 00:23:55.827 
"base_bdevs_list": [ 00:23:55.827 { 00:23:55.827 "name": null, 00:23:55.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:55.827 "is_configured": false, 00:23:55.827 "data_offset": 2048, 00:23:55.827 "data_size": 63488 00:23:55.827 }, 00:23:55.827 { 00:23:55.827 "name": "BaseBdev2", 00:23:55.827 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:55.827 "is_configured": true, 00:23:55.827 "data_offset": 2048, 00:23:55.827 "data_size": 63488 00:23:55.827 } 00:23:55.827 ] 00:23:55.827 }' 00:23:55.827 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:55.827 13:49:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:56.395 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:56.395 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:56.395 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:56.395 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:56.395 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:56.395 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.395 13:49:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.654 13:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:56.654 "name": "raid_bdev1", 00:23:56.654 "uuid": "e36d703c-d59b-46c8-860a-c41e6cf8fb35", 00:23:56.654 "strip_size_kb": 0, 00:23:56.654 "state": "online", 00:23:56.654 "raid_level": "raid1", 00:23:56.654 "superblock": true, 00:23:56.654 "num_base_bdevs": 2, 00:23:56.654 "num_base_bdevs_discovered": 1, 00:23:56.654 "num_base_bdevs_operational": 1, 00:23:56.654 "base_bdevs_list": [ 00:23:56.654 { 00:23:56.654 "name": null, 00:23:56.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.654 "is_configured": false, 00:23:56.654 "data_offset": 2048, 00:23:56.654 "data_size": 63488 00:23:56.654 }, 00:23:56.654 { 00:23:56.654 "name": "BaseBdev2", 00:23:56.654 "uuid": "8536d9f8-9a2f-59db-ab4e-f5a4ee0492a2", 00:23:56.654 "is_configured": true, 00:23:56.654 "data_offset": 2048, 00:23:56.654 "data_size": 63488 00:23:56.654 } 00:23:56.654 ] 00:23:56.654 }' 00:23:56.654 13:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:56.654 13:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:56.654 13:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:56.654 13:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:56.654 13:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 541740 00:23:56.654 13:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 541740 ']' 00:23:56.654 13:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 541740 00:23:56.654 13:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:23:56.654 13:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:56.654 13:49:45 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 541740 00:23:56.913 13:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:56.913 13:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:56.913 13:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 541740' 00:23:56.913 killing process with pid 541740 00:23:56.913 13:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 541740 00:23:56.913 Received shutdown signal, test time was about 60.000000 seconds 00:23:56.913 00:23:56.913 Latency(us) 00:23:56.913 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:56.913 =================================================================================================================== 00:23:56.913 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:56.913 [2024-07-12 13:49:45.258544] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:56.913 [2024-07-12 13:49:45.258635] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:56.913 [2024-07-12 13:49:45.258676] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:56.913 [2024-07-12 13:49:45.258688] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xad4ba0 name raid_bdev1, state offline 00:23:56.913 13:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 541740 00:23:56.913 [2024-07-12 13:49:45.285485] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:56.913 13:49:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:23:56.913 00:23:56.913 real 0m36.399s 00:23:56.913 user 0m52.185s 00:23:56.913 sys 0m7.156s 00:23:56.913 13:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:56.913 13:49:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:56.913 ************************************ 00:23:56.913 END TEST raid_rebuild_test_sb 00:23:56.913 ************************************ 00:23:57.173 13:49:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:57.173 13:49:45 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:23:57.173 13:49:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:57.173 13:49:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:57.173 13:49:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:57.173 ************************************ 00:23:57.173 START TEST raid_rebuild_test_io 00:23:57.173 ************************************ 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:57.173 
13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=546859 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 546859 /var/tmp/spdk-raid.sock 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 546859 ']' 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:57.173 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:57.173 13:49:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:57.173 [2024-07-12 13:49:45.643055] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
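[Editor's note: the raid1 rebuild scenario traced in this run can be reproduced by hand against the same RPC socket. The sketch below is illustrative only, not part of the captured log: it assumes an SPDK application (bdevperf or spdk_tgt) is already listening on /var/tmp/spdk-raid.sock, and it uses only RPC methods and parameter values that appear verbatim in the trace (bdev_malloc_create, bdev_passthru_create, bdev_raid_create, bdev_raid_get_bdevs).]

# Minimal manual sketch of the raid1 setup exercised by bdev_raid.sh (assumption:
# an SPDK app is already listening on /var/tmp/spdk-raid.sock).
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Two 32 MiB malloc bdevs (512 B blocks) wrapped in passthru bdevs, as in the test.
$RPC bdev_malloc_create 32 512 -b BaseBdev1_malloc
$RPC bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
$RPC bdev_malloc_create 32 512 -b BaseBdev2_malloc
$RPC bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2

# Assemble the raid1 bdev and inspect its state / rebuild progress, as
# verify_raid_bdev_state and verify_raid_bdev_process do in the trace.
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

[Removing a base bdev (bdev_passthru_delete) and re-creating it, as the test does with "spare", is what triggers the rebuild whose "process": {"type": "rebuild", ...} progress blocks appear in the bdev_raid_get_bdevs output below.]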
00:23:57.173 [2024-07-12 13:49:45.643122] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid546859 ] 00:23:57.173 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:57.173 Zero copy mechanism will not be used. 00:23:57.432 [2024-07-12 13:49:45.763765] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:57.432 [2024-07-12 13:49:45.862355] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:57.432 [2024-07-12 13:49:45.925875] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:57.432 [2024-07-12 13:49:45.925922] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:58.369 13:49:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:58.369 13:49:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:23:58.369 13:49:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:58.369 13:49:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:58.370 BaseBdev1_malloc 00:23:58.370 13:49:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:58.629 [2024-07-12 13:49:47.064242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:58.629 [2024-07-12 13:49:47.064292] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:58.629 [2024-07-12 13:49:47.064315] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1435680 00:23:58.629 [2024-07-12 13:49:47.064328] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:58.629 [2024-07-12 13:49:47.066069] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:58.629 [2024-07-12 13:49:47.066097] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:58.629 BaseBdev1 00:23:58.629 13:49:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:58.629 13:49:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:58.887 BaseBdev2_malloc 00:23:58.887 13:49:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:59.146 [2024-07-12 13:49:47.558395] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:59.146 [2024-07-12 13:49:47.558440] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:59.146 [2024-07-12 13:49:47.558465] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14361a0 00:23:59.146 [2024-07-12 13:49:47.558477] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:59.146 [2024-07-12 13:49:47.560061] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:59.146 [2024-07-12 13:49:47.560088] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:59.146 BaseBdev2 00:23:59.146 13:49:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:59.405 spare_malloc 00:23:59.406 13:49:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:59.664 spare_delay 00:23:59.664 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:59.923 [2024-07-12 13:49:48.292940] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:59.923 [2024-07-12 13:49:48.292987] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:59.923 [2024-07-12 13:49:48.293008] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15e4800 00:23:59.923 [2024-07-12 13:49:48.293021] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:59.923 [2024-07-12 13:49:48.294638] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:59.923 [2024-07-12 13:49:48.294666] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:59.923 spare 00:23:59.923 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:00.182 [2024-07-12 13:49:48.537591] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:00.182 [2024-07-12 13:49:48.538993] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:00.182 [2024-07-12 13:49:48.539071] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15e59b0 00:24:00.182 [2024-07-12 13:49:48.539082] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:00.182 [2024-07-12 13:49:48.539293] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15dedd0 00:24:00.182 [2024-07-12 13:49:48.539433] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15e59b0 00:24:00.182 [2024-07-12 13:49:48.539442] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15e59b0 00:24:00.182 [2024-07-12 13:49:48.539557] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:00.182 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:00.182 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:00.182 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:00.182 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:00.182 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:00.182 13:49:48 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:00.182 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:00.182 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:00.182 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:00.182 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:00.182 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.182 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.441 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:00.441 "name": "raid_bdev1", 00:24:00.441 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:00.441 "strip_size_kb": 0, 00:24:00.441 "state": "online", 00:24:00.441 "raid_level": "raid1", 00:24:00.441 "superblock": false, 00:24:00.441 "num_base_bdevs": 2, 00:24:00.441 "num_base_bdevs_discovered": 2, 00:24:00.441 "num_base_bdevs_operational": 2, 00:24:00.441 "base_bdevs_list": [ 00:24:00.441 { 00:24:00.442 "name": "BaseBdev1", 00:24:00.442 "uuid": "49975d32-96ae-5403-bd12-0370f8a3a131", 00:24:00.442 "is_configured": true, 00:24:00.442 "data_offset": 0, 00:24:00.442 "data_size": 65536 00:24:00.442 }, 00:24:00.442 { 00:24:00.442 "name": "BaseBdev2", 00:24:00.442 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:00.442 "is_configured": true, 00:24:00.442 "data_offset": 0, 00:24:00.442 "data_size": 65536 00:24:00.442 } 00:24:00.442 ] 00:24:00.442 }' 00:24:00.442 13:49:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:00.442 13:49:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:01.008 13:49:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:01.008 13:49:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:01.267 [2024-07-12 13:49:49.624784] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:01.267 13:49:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:01.267 13:49:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.267 13:49:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:01.525 13:49:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:01.525 13:49:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:01.525 13:49:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:01.525 13:49:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:01.525 [2024-07-12 13:49:50.007652] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x15e0510 00:24:01.525 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:01.525 Zero copy mechanism will not be used. 00:24:01.525 Running I/O for 60 seconds... 00:24:01.784 [2024-07-12 13:49:50.142750] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:01.784 [2024-07-12 13:49:50.150913] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x15e0510 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.784 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:02.043 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:02.043 "name": "raid_bdev1", 00:24:02.043 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:02.043 "strip_size_kb": 0, 00:24:02.043 "state": "online", 00:24:02.043 "raid_level": "raid1", 00:24:02.043 "superblock": false, 00:24:02.043 "num_base_bdevs": 2, 00:24:02.043 "num_base_bdevs_discovered": 1, 00:24:02.043 "num_base_bdevs_operational": 1, 00:24:02.043 "base_bdevs_list": [ 00:24:02.043 { 00:24:02.043 "name": null, 00:24:02.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.043 "is_configured": false, 00:24:02.043 "data_offset": 0, 00:24:02.043 "data_size": 65536 00:24:02.043 }, 00:24:02.043 { 00:24:02.043 "name": "BaseBdev2", 00:24:02.043 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:02.043 "is_configured": true, 00:24:02.043 "data_offset": 0, 00:24:02.043 "data_size": 65536 00:24:02.043 } 00:24:02.043 ] 00:24:02.043 }' 00:24:02.043 13:49:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:02.043 13:49:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:02.611 13:49:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:02.871 [2024-07-12 13:49:51.336716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:02.871 13:49:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:02.871 [2024-07-12 13:49:51.395842] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x15681f0 00:24:02.871 [2024-07-12 13:49:51.398175] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:03.130 [2024-07-12 13:49:51.530897] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:03.130 [2024-07-12 13:49:51.531326] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:03.389 [2024-07-12 13:49:51.760592] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:03.389 [2024-07-12 13:49:51.760806] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:03.957 [2024-07-12 13:49:52.263817] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:03.957 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:03.957 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:03.957 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:03.957 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:03.957 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:03.957 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.957 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.215 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:04.216 "name": "raid_bdev1", 00:24:04.216 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:04.216 "strip_size_kb": 0, 00:24:04.216 "state": "online", 00:24:04.216 "raid_level": "raid1", 00:24:04.216 "superblock": false, 00:24:04.216 "num_base_bdevs": 2, 00:24:04.216 "num_base_bdevs_discovered": 2, 00:24:04.216 "num_base_bdevs_operational": 2, 00:24:04.216 "process": { 00:24:04.216 "type": "rebuild", 00:24:04.216 "target": "spare", 00:24:04.216 "progress": { 00:24:04.216 "blocks": 12288, 00:24:04.216 "percent": 18 00:24:04.216 } 00:24:04.216 }, 00:24:04.216 "base_bdevs_list": [ 00:24:04.216 { 00:24:04.216 "name": "spare", 00:24:04.216 "uuid": "ae8b7dfd-0d5c-5509-8172-f35bf430ca0a", 00:24:04.216 "is_configured": true, 00:24:04.216 "data_offset": 0, 00:24:04.216 "data_size": 65536 00:24:04.216 }, 00:24:04.216 { 00:24:04.216 "name": "BaseBdev2", 00:24:04.216 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:04.216 "is_configured": true, 00:24:04.216 "data_offset": 0, 00:24:04.216 "data_size": 65536 00:24:04.216 } 00:24:04.216 ] 00:24:04.216 }' 00:24:04.216 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:04.216 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:04.216 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:04.216 13:49:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:04.216 13:49:52 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:04.216 [2024-07-12 13:49:52.751251] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:04.473 [2024-07-12 13:49:52.950268] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:04.473 [2024-07-12 13:49:53.034446] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:04.473 [2024-07-12 13:49:53.044231] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:04.473 [2024-07-12 13:49:53.044258] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:04.473 [2024-07-12 13:49:53.044268] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:04.730 [2024-07-12 13:49:53.066106] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x15e0510 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.730 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.989 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:04.989 "name": "raid_bdev1", 00:24:04.989 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:04.989 "strip_size_kb": 0, 00:24:04.989 "state": "online", 00:24:04.989 "raid_level": "raid1", 00:24:04.989 "superblock": false, 00:24:04.989 "num_base_bdevs": 2, 00:24:04.989 "num_base_bdevs_discovered": 1, 00:24:04.989 "num_base_bdevs_operational": 1, 00:24:04.989 "base_bdevs_list": [ 00:24:04.989 { 00:24:04.989 "name": null, 00:24:04.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.989 "is_configured": false, 00:24:04.989 "data_offset": 0, 00:24:04.989 "data_size": 65536 00:24:04.989 }, 00:24:04.989 { 00:24:04.989 "name": "BaseBdev2", 00:24:04.989 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:04.989 "is_configured": true, 00:24:04.989 "data_offset": 0, 00:24:04.989 "data_size": 65536 00:24:04.989 } 00:24:04.989 ] 00:24:04.989 }' 00:24:04.989 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:24:04.989 13:49:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:05.556 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:05.556 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:05.556 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:05.556 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:05.556 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:05.556 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.556 13:49:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.814 13:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:05.814 "name": "raid_bdev1", 00:24:05.814 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:05.814 "strip_size_kb": 0, 00:24:05.814 "state": "online", 00:24:05.814 "raid_level": "raid1", 00:24:05.814 "superblock": false, 00:24:05.814 "num_base_bdevs": 2, 00:24:05.814 "num_base_bdevs_discovered": 1, 00:24:05.814 "num_base_bdevs_operational": 1, 00:24:05.814 "base_bdevs_list": [ 00:24:05.814 { 00:24:05.814 "name": null, 00:24:05.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.814 "is_configured": false, 00:24:05.814 "data_offset": 0, 00:24:05.814 "data_size": 65536 00:24:05.814 }, 00:24:05.814 { 00:24:05.814 "name": "BaseBdev2", 00:24:05.814 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:05.814 "is_configured": true, 00:24:05.814 "data_offset": 0, 00:24:05.814 "data_size": 65536 00:24:05.814 } 00:24:05.814 ] 00:24:05.814 }' 00:24:05.814 13:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:05.814 13:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:05.814 13:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:05.814 13:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:05.814 13:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:06.072 [2024-07-12 13:49:54.590171] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:06.072 13:49:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:06.072 [2024-07-12 13:49:54.650010] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15e5d90 00:24:06.072 [2024-07-12 13:49:54.651512] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:06.329 [2024-07-12 13:49:54.762530] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:06.329 [2024-07-12 13:49:54.762833] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:06.329 [2024-07-12 13:49:54.889360] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 
offset_end: 6144 00:24:06.329 [2024-07-12 13:49:54.889484] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:06.895 [2024-07-12 13:49:55.245586] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:06.895 [2024-07-12 13:49:55.447516] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:06.895 [2024-07-12 13:49:55.447677] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:07.153 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:07.153 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.153 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:07.153 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:07.153 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:07.153 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.153 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.411 [2024-07-12 13:49:55.788293] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:07.411 "name": "raid_bdev1", 00:24:07.411 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:07.411 "strip_size_kb": 0, 00:24:07.411 "state": "online", 00:24:07.411 "raid_level": "raid1", 00:24:07.411 "superblock": false, 00:24:07.411 "num_base_bdevs": 2, 00:24:07.411 "num_base_bdevs_discovered": 2, 00:24:07.411 "num_base_bdevs_operational": 2, 00:24:07.411 "process": { 00:24:07.411 "type": "rebuild", 00:24:07.411 "target": "spare", 00:24:07.411 "progress": { 00:24:07.411 "blocks": 16384, 00:24:07.411 "percent": 25 00:24:07.411 } 00:24:07.411 }, 00:24:07.411 "base_bdevs_list": [ 00:24:07.411 { 00:24:07.411 "name": "spare", 00:24:07.411 "uuid": "ae8b7dfd-0d5c-5509-8172-f35bf430ca0a", 00:24:07.411 "is_configured": true, 00:24:07.411 "data_offset": 0, 00:24:07.411 "data_size": 65536 00:24:07.411 }, 00:24:07.411 { 00:24:07.411 "name": "BaseBdev2", 00:24:07.411 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:07.411 "is_configured": true, 00:24:07.411 "data_offset": 0, 00:24:07.411 "data_size": 65536 00:24:07.411 } 00:24:07.411 ] 00:24:07.411 }' 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local 
num_base_bdevs_operational=2 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=859 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.411 13:49:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.669 [2024-07-12 13:49:56.152844] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:07.669 13:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:07.669 "name": "raid_bdev1", 00:24:07.669 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:07.669 "strip_size_kb": 0, 00:24:07.669 "state": "online", 00:24:07.669 "raid_level": "raid1", 00:24:07.669 "superblock": false, 00:24:07.669 "num_base_bdevs": 2, 00:24:07.669 "num_base_bdevs_discovered": 2, 00:24:07.669 "num_base_bdevs_operational": 2, 00:24:07.669 "process": { 00:24:07.669 "type": "rebuild", 00:24:07.669 "target": "spare", 00:24:07.669 "progress": { 00:24:07.669 "blocks": 18432, 00:24:07.669 "percent": 28 00:24:07.669 } 00:24:07.669 }, 00:24:07.669 "base_bdevs_list": [ 00:24:07.669 { 00:24:07.669 "name": "spare", 00:24:07.669 "uuid": "ae8b7dfd-0d5c-5509-8172-f35bf430ca0a", 00:24:07.669 "is_configured": true, 00:24:07.669 "data_offset": 0, 00:24:07.669 "data_size": 65536 00:24:07.669 }, 00:24:07.669 { 00:24:07.669 "name": "BaseBdev2", 00:24:07.669 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:07.669 "is_configured": true, 00:24:07.669 "data_offset": 0, 00:24:07.669 "data_size": 65536 00:24:07.669 } 00:24:07.669 ] 00:24:07.669 }' 00:24:07.669 13:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:07.669 13:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:07.669 13:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:07.927 13:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:07.927 13:49:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:07.927 [2024-07-12 13:49:56.372536] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:08.863 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:08.863 13:49:57 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:08.863 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:08.863 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:08.863 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:08.863 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:08.863 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.863 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.122 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.122 "name": "raid_bdev1", 00:24:09.122 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:09.122 "strip_size_kb": 0, 00:24:09.122 "state": "online", 00:24:09.122 "raid_level": "raid1", 00:24:09.122 "superblock": false, 00:24:09.122 "num_base_bdevs": 2, 00:24:09.122 "num_base_bdevs_discovered": 2, 00:24:09.122 "num_base_bdevs_operational": 2, 00:24:09.122 "process": { 00:24:09.122 "type": "rebuild", 00:24:09.122 "target": "spare", 00:24:09.122 "progress": { 00:24:09.122 "blocks": 38912, 00:24:09.122 "percent": 59 00:24:09.122 } 00:24:09.122 }, 00:24:09.122 "base_bdevs_list": [ 00:24:09.122 { 00:24:09.122 "name": "spare", 00:24:09.122 "uuid": "ae8b7dfd-0d5c-5509-8172-f35bf430ca0a", 00:24:09.122 "is_configured": true, 00:24:09.122 "data_offset": 0, 00:24:09.122 "data_size": 65536 00:24:09.122 }, 00:24:09.122 { 00:24:09.122 "name": "BaseBdev2", 00:24:09.122 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:09.122 "is_configured": true, 00:24:09.122 "data_offset": 0, 00:24:09.122 "data_size": 65536 00:24:09.122 } 00:24:09.122 ] 00:24:09.122 }' 00:24:09.122 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.122 [2024-07-12 13:49:57.530628] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:24:09.122 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:09.122 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.122 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:09.122 13:49:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:09.381 [2024-07-12 13:49:57.868906] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:24:09.639 [2024-07-12 13:49:58.088247] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:24:10.205 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:10.205 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:10.205 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.206 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:10.206 13:49:58 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:10.206 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:10.206 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.206 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.206 [2024-07-12 13:49:58.765911] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:24:10.465 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:10.465 "name": "raid_bdev1", 00:24:10.465 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:10.465 "strip_size_kb": 0, 00:24:10.465 "state": "online", 00:24:10.465 "raid_level": "raid1", 00:24:10.465 "superblock": false, 00:24:10.465 "num_base_bdevs": 2, 00:24:10.465 "num_base_bdevs_discovered": 2, 00:24:10.465 "num_base_bdevs_operational": 2, 00:24:10.465 "process": { 00:24:10.465 "type": "rebuild", 00:24:10.465 "target": "spare", 00:24:10.465 "progress": { 00:24:10.465 "blocks": 57344, 00:24:10.465 "percent": 87 00:24:10.465 } 00:24:10.465 }, 00:24:10.465 "base_bdevs_list": [ 00:24:10.465 { 00:24:10.465 "name": "spare", 00:24:10.465 "uuid": "ae8b7dfd-0d5c-5509-8172-f35bf430ca0a", 00:24:10.465 "is_configured": true, 00:24:10.465 "data_offset": 0, 00:24:10.465 "data_size": 65536 00:24:10.465 }, 00:24:10.465 { 00:24:10.465 "name": "BaseBdev2", 00:24:10.465 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:10.465 "is_configured": true, 00:24:10.465 "data_offset": 0, 00:24:10.465 "data_size": 65536 00:24:10.465 } 00:24:10.465 ] 00:24:10.465 }' 00:24:10.465 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.465 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:10.465 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.465 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:10.465 13:49:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:11.031 [2024-07-12 13:49:59.315030] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:11.031 [2024-07-12 13:49:59.415279] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:11.031 [2024-07-12 13:49:59.417037] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:11.598 13:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:11.598 13:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:11.598 13:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:11.598 13:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:11.598 13:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:11.598 13:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.598 13:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.598 13:49:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:11.857 "name": "raid_bdev1", 00:24:11.857 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:11.857 "strip_size_kb": 0, 00:24:11.857 "state": "online", 00:24:11.857 "raid_level": "raid1", 00:24:11.857 "superblock": false, 00:24:11.857 "num_base_bdevs": 2, 00:24:11.857 "num_base_bdevs_discovered": 2, 00:24:11.857 "num_base_bdevs_operational": 2, 00:24:11.857 "base_bdevs_list": [ 00:24:11.857 { 00:24:11.857 "name": "spare", 00:24:11.857 "uuid": "ae8b7dfd-0d5c-5509-8172-f35bf430ca0a", 00:24:11.857 "is_configured": true, 00:24:11.857 "data_offset": 0, 00:24:11.857 "data_size": 65536 00:24:11.857 }, 00:24:11.857 { 00:24:11.857 "name": "BaseBdev2", 00:24:11.857 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:11.857 "is_configured": true, 00:24:11.857 "data_offset": 0, 00:24:11.857 "data_size": 65536 00:24:11.857 } 00:24:11.857 ] 00:24:11.857 }' 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.857 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:12.116 "name": "raid_bdev1", 00:24:12.116 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:12.116 "strip_size_kb": 0, 00:24:12.116 "state": "online", 00:24:12.116 "raid_level": "raid1", 00:24:12.116 "superblock": false, 00:24:12.116 "num_base_bdevs": 2, 00:24:12.116 "num_base_bdevs_discovered": 2, 00:24:12.116 "num_base_bdevs_operational": 2, 00:24:12.116 "base_bdevs_list": [ 00:24:12.116 { 00:24:12.116 "name": "spare", 00:24:12.116 "uuid": "ae8b7dfd-0d5c-5509-8172-f35bf430ca0a", 00:24:12.116 "is_configured": true, 00:24:12.116 "data_offset": 0, 00:24:12.116 "data_size": 65536 00:24:12.116 }, 00:24:12.116 { 00:24:12.116 "name": "BaseBdev2", 00:24:12.116 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:12.116 "is_configured": true, 00:24:12.116 "data_offset": 0, 00:24:12.116 "data_size": 65536 
00:24:12.116 } 00:24:12.116 ] 00:24:12.116 }' 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.116 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.375 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:12.375 "name": "raid_bdev1", 00:24:12.375 "uuid": "d40698c7-5941-44f2-9dcd-bb140f623aa0", 00:24:12.375 "strip_size_kb": 0, 00:24:12.375 "state": "online", 00:24:12.375 "raid_level": "raid1", 00:24:12.375 "superblock": false, 00:24:12.375 "num_base_bdevs": 2, 00:24:12.375 "num_base_bdevs_discovered": 2, 00:24:12.375 "num_base_bdevs_operational": 2, 00:24:12.375 "base_bdevs_list": [ 00:24:12.375 { 00:24:12.375 "name": "spare", 00:24:12.375 "uuid": "ae8b7dfd-0d5c-5509-8172-f35bf430ca0a", 00:24:12.375 "is_configured": true, 00:24:12.375 "data_offset": 0, 00:24:12.375 "data_size": 65536 00:24:12.375 }, 00:24:12.375 { 00:24:12.375 "name": "BaseBdev2", 00:24:12.375 "uuid": "044cf3b1-63b0-5b74-9742-b215a0b55288", 00:24:12.376 "is_configured": true, 00:24:12.376 "data_offset": 0, 00:24:12.376 "data_size": 65536 00:24:12.376 } 00:24:12.376 ] 00:24:12.376 }' 00:24:12.376 13:50:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:12.376 13:50:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:12.943 13:50:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:13.201 [2024-07-12 13:50:01.661504] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:13.201 [2024-07-12 13:50:01.661540] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:13.201 
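The state verification traced a few entries above (the @126 rpc.py call plus the jq select) reduces to one RPC call filtered through jq. As a minimal standalone sketch, with the socket path, script path, and JSON field names copied from the trace (the exit-on-mismatch handling is illustrative, not the harness's own):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  info=$($rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  # expected values taken from verify_raid_bdev_state raid_bdev1 online raid1 0 2 above
  [ "$(echo "$info" | jq -r '.state')" = online ] || exit 1
  [ "$(echo "$info" | jq -r '.raid_level')" = raid1 ] || exit 1
  [ "$(echo "$info" | jq -r '.num_base_bdevs_discovered')" -eq 2 ] || exit 1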
00:24:13.201 Latency(us) 00:24:13.201 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:13.201 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:13.201 raid_bdev1 : 11.69 97.46 292.37 0.00 0.00 14450.86 283.16 118534.68 00:24:13.201 =================================================================================================================== 00:24:13.201 Total : 97.46 292.37 0.00 0.00 14450.86 283.16 118534.68 00:24:13.201 [2024-07-12 13:50:01.729604] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:13.201 [2024-07-12 13:50:01.729632] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:13.201 [2024-07-12 13:50:01.729704] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:13.201 [2024-07-12 13:50:01.729716] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15e59b0 name raid_bdev1, state offline 00:24:13.201 0 00:24:13.201 13:50:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.202 13:50:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:13.460 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:13.719 /dev/nbd0 00:24:13.719 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:13.719 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:13.719 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:13.719 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:13.719 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:13.719 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:13.719 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
00:24:13.719 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:13.719 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:13.719 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:13.719 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:13.719 1+0 records in 00:24:13.719 1+0 records out 00:24:13.719 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274209 s, 14.9 MB/s 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:13.720 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:13.978 /dev/nbd1 00:24:13.978 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:13.978 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:13.978 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:13.978 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:13.978 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:13.978 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io 
-- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:13.979 1+0 records in 00:24:13.979 1+0 records out 00:24:13.979 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290721 s, 14.1 MB/s 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:13.979 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:14.237 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:14.237 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:14.237 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:14.237 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:14.237 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:14.237 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:14.237 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:14.496 13:50:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:14.496 13:50:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:14.496 13:50:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:14.496 13:50:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:14.496 13:50:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:14.496 13:50:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:14.496 13:50:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:14.496 13:50:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:14.496 13:50:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:14.497 13:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:14.497 13:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 546859 00:24:14.497 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 546859 ']' 00:24:14.497 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 546859 00:24:14.497 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:24:14.497 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:14.497 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 546859 00:24:14.755 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:14.755 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:14.755 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 546859' 00:24:14.755 killing process with pid 546859 00:24:14.755 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 546859 00:24:14.755 Received shutdown signal, test time was about 13.068781 seconds 00:24:14.755 00:24:14.755 Latency(us) 00:24:14.755 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:14.755 =================================================================================================================== 00:24:14.755 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:14.755 [2024-07-12 13:50:03.110587] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:14.755 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 546859 00:24:14.755 [2024-07-12 13:50:03.132846] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:15.014 00:24:15.014 real 0m17.790s 00:24:15.014 user 0m27.124s 
00:24:15.014 sys 0m2.748s 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:15.014 ************************************ 00:24:15.014 END TEST raid_rebuild_test_io 00:24:15.014 ************************************ 00:24:15.014 13:50:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:15.014 13:50:03 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:24:15.014 13:50:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:15.014 13:50:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:15.014 13:50:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:15.014 ************************************ 00:24:15.014 START TEST raid_rebuild_test_sb_io 00:24:15.014 ************************************ 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' 
true = true ']' 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=549390 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 549390 /var/tmp/spdk-raid.sock 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 549390 ']' 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:15.014 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:15.014 13:50:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:15.014 [2024-07-12 13:50:03.524671] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:24:15.014 [2024-07-12 13:50:03.524750] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid549390 ] 00:24:15.014 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:15.014 Zero copy mechanism will not be used. 
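The bdevperf invocation at the @595 trace above is what drives the background I/O for this test; the flags match the 3145728-byte I/O size reported just above, and the -z flag keeps the app idle until the harness later triggers I/O via bdevperf.py perform_tests, as the @622 call further down in the trace shows. A rough standalone sketch of starting it and waiting for its RPC socket; the real waitforlisten helper in autotest_common.sh is more thorough, and spdk_get_version is used here only as a cheap, always-available RPC to poll:

  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
      -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!
  # poll until the RPC socket answers; bdevperf then waits for perform_tests before issuing I/O
  until /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock spdk_get_version >/dev/null 2>&1; do
      sleep 0.1
  done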
00:24:15.273 [2024-07-12 13:50:03.655717] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:15.273 [2024-07-12 13:50:03.756976] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:15.273 [2024-07-12 13:50:03.811877] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:15.273 [2024-07-12 13:50:03.811911] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:16.208 13:50:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:16.208 13:50:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:24:16.208 13:50:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:16.208 13:50:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:16.208 BaseBdev1_malloc 00:24:16.208 13:50:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:16.467 [2024-07-12 13:50:04.943004] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:16.467 [2024-07-12 13:50:04.943055] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:16.467 [2024-07-12 13:50:04.943079] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8da680 00:24:16.467 [2024-07-12 13:50:04.943092] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:16.467 [2024-07-12 13:50:04.944752] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:16.467 [2024-07-12 13:50:04.944780] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:16.467 BaseBdev1 00:24:16.467 13:50:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:16.467 13:50:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:16.727 BaseBdev2_malloc 00:24:16.727 13:50:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:16.985 [2024-07-12 13:50:05.437188] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:16.985 [2024-07-12 13:50:05.437237] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:16.985 [2024-07-12 13:50:05.437260] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8db1a0 00:24:16.985 [2024-07-12 13:50:05.437273] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:16.985 [2024-07-12 13:50:05.438738] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:16.985 [2024-07-12 13:50:05.438766] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:16.985 BaseBdev2 00:24:16.985 13:50:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 
32 512 -b spare_malloc 00:24:17.243 spare_malloc 00:24:17.243 13:50:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:17.501 spare_delay 00:24:17.501 13:50:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:17.758 [2024-07-12 13:50:06.187861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:17.758 [2024-07-12 13:50:06.187907] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:17.758 [2024-07-12 13:50:06.187935] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa89800 00:24:17.758 [2024-07-12 13:50:06.187948] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:17.758 [2024-07-12 13:50:06.189360] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:17.758 [2024-07-12 13:50:06.189387] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:17.758 spare 00:24:17.758 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:18.016 [2024-07-12 13:50:06.436535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:18.016 [2024-07-12 13:50:06.437686] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:18.016 [2024-07-12 13:50:06.437838] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa8a9b0 00:24:18.016 [2024-07-12 13:50:06.437851] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:18.016 [2024-07-12 13:50:06.438033] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa83dd0 00:24:18.016 [2024-07-12 13:50:06.438165] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa8a9b0 00:24:18.016 [2024-07-12 13:50:06.438175] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa8a9b0 00:24:18.016 [2024-07-12 13:50:06.438264] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.016 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.274 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:18.274 "name": "raid_bdev1", 00:24:18.274 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:18.274 "strip_size_kb": 0, 00:24:18.274 "state": "online", 00:24:18.274 "raid_level": "raid1", 00:24:18.274 "superblock": true, 00:24:18.274 "num_base_bdevs": 2, 00:24:18.274 "num_base_bdevs_discovered": 2, 00:24:18.274 "num_base_bdevs_operational": 2, 00:24:18.274 "base_bdevs_list": [ 00:24:18.274 { 00:24:18.274 "name": "BaseBdev1", 00:24:18.274 "uuid": "f7d3ec93-bf39-51ce-8a67-5379980966ac", 00:24:18.274 "is_configured": true, 00:24:18.274 "data_offset": 2048, 00:24:18.274 "data_size": 63488 00:24:18.274 }, 00:24:18.274 { 00:24:18.274 "name": "BaseBdev2", 00:24:18.274 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:18.274 "is_configured": true, 00:24:18.274 "data_offset": 2048, 00:24:18.274 "data_size": 63488 00:24:18.274 } 00:24:18.274 ] 00:24:18.274 }' 00:24:18.274 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:18.274 13:50:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:19.208 13:50:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:19.208 13:50:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:19.208 [2024-07-12 13:50:07.648003] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:19.208 13:50:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:19.208 13:50:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.208 13:50:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:19.465 13:50:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:19.465 13:50:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:19.465 13:50:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:19.465 13:50:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:19.465 [2024-07-12 13:50:08.022937] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa8b590 00:24:19.465 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:19.465 Zero copy mechanism will not be used. 00:24:19.465 Running I/O for 60 seconds... 
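With background I/O running, the harness pulls BaseBdev1 out of the array (the @639 call above); the JSON snapshots that follow show the raid bdev staying online while num_base_bdevs_discovered and num_base_bdevs_operational drop to 1. A condensed sketch of that remove-and-check step, with paths and RPC names taken from the trace and the expected output inferred from the JSON below:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # note: $rpc is intentionally left unquoted so the -s option splits into separate arguments
  $rpc bdev_raid_remove_base_bdev BaseBdev1
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")
      | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs_operational)"'
  # expected: online 1/1 (raid1 keeps serving I/O from the remaining base bdev)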
00:24:19.723 [2024-07-12 13:50:08.140457] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:19.723 [2024-07-12 13:50:08.140664] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xa8b590 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.723 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:19.982 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:19.982 "name": "raid_bdev1", 00:24:19.982 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:19.982 "strip_size_kb": 0, 00:24:19.982 "state": "online", 00:24:19.982 "raid_level": "raid1", 00:24:19.982 "superblock": true, 00:24:19.982 "num_base_bdevs": 2, 00:24:19.982 "num_base_bdevs_discovered": 1, 00:24:19.982 "num_base_bdevs_operational": 1, 00:24:19.982 "base_bdevs_list": [ 00:24:19.982 { 00:24:19.982 "name": null, 00:24:19.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:19.982 "is_configured": false, 00:24:19.982 "data_offset": 2048, 00:24:19.982 "data_size": 63488 00:24:19.982 }, 00:24:19.982 { 00:24:19.982 "name": "BaseBdev2", 00:24:19.982 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:19.982 "is_configured": true, 00:24:19.982 "data_offset": 2048, 00:24:19.982 "data_size": 63488 00:24:19.982 } 00:24:19.982 ] 00:24:19.982 }' 00:24:19.982 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:19.982 13:50:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:20.548 13:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:20.807 [2024-07-12 13:50:09.321182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:20.807 13:50:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:20.807 [2024-07-12 13:50:09.380655] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9f6b70 00:24:20.807 [2024-07-12 13:50:09.383012] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev 
raid_bdev1 00:24:21.066 [2024-07-12 13:50:09.501868] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:21.066 [2024-07-12 13:50:09.502340] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:21.324 [2024-07-12 13:50:09.736831] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:21.324 [2024-07-12 13:50:09.745223] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:21.583 [2024-07-12 13:50:10.095865] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:21.842 [2024-07-12 13:50:10.324574] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:21.842 [2024-07-12 13:50:10.324822] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:21.842 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:21.842 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:21.842 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:21.842 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:21.842 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:21.842 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.842 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.100 [2024-07-12 13:50:10.555472] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:22.101 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:22.101 "name": "raid_bdev1", 00:24:22.101 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:22.101 "strip_size_kb": 0, 00:24:22.101 "state": "online", 00:24:22.101 "raid_level": "raid1", 00:24:22.101 "superblock": true, 00:24:22.101 "num_base_bdevs": 2, 00:24:22.101 "num_base_bdevs_discovered": 2, 00:24:22.101 "num_base_bdevs_operational": 2, 00:24:22.101 "process": { 00:24:22.101 "type": "rebuild", 00:24:22.101 "target": "spare", 00:24:22.101 "progress": { 00:24:22.101 "blocks": 14336, 00:24:22.101 "percent": 22 00:24:22.101 } 00:24:22.101 }, 00:24:22.101 "base_bdevs_list": [ 00:24:22.101 { 00:24:22.101 "name": "spare", 00:24:22.101 "uuid": "58d77626-2cde-5d13-b13f-6cf4152839d5", 00:24:22.101 "is_configured": true, 00:24:22.101 "data_offset": 2048, 00:24:22.101 "data_size": 63488 00:24:22.101 }, 00:24:22.101 { 00:24:22.101 "name": "BaseBdev2", 00:24:22.101 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:22.101 "is_configured": true, 00:24:22.101 "data_offset": 2048, 00:24:22.101 "data_size": 63488 00:24:22.101 } 00:24:22.101 ] 00:24:22.101 }' 00:24:22.101 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:22.101 13:50:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:22.359 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:22.359 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:22.359 13:50:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:22.359 [2024-07-12 13:50:10.933451] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:22.618 [2024-07-12 13:50:10.958960] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:22.618 [2024-07-12 13:50:11.035357] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:22.618 [2024-07-12 13:50:11.035609] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:22.618 [2024-07-12 13:50:11.136575] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:22.618 [2024-07-12 13:50:11.146623] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:22.618 [2024-07-12 13:50:11.146650] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:22.618 [2024-07-12 13:50:11.146661] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:22.618 [2024-07-12 13:50:11.177110] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xa8b590 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.877 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.136 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:23.136 "name": "raid_bdev1", 00:24:23.136 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:23.136 "strip_size_kb": 0, 00:24:23.136 "state": "online", 00:24:23.136 "raid_level": "raid1", 
00:24:23.136 "superblock": true, 00:24:23.136 "num_base_bdevs": 2, 00:24:23.136 "num_base_bdevs_discovered": 1, 00:24:23.136 "num_base_bdevs_operational": 1, 00:24:23.136 "base_bdevs_list": [ 00:24:23.136 { 00:24:23.136 "name": null, 00:24:23.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:23.136 "is_configured": false, 00:24:23.136 "data_offset": 2048, 00:24:23.136 "data_size": 63488 00:24:23.136 }, 00:24:23.136 { 00:24:23.136 "name": "BaseBdev2", 00:24:23.136 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:23.136 "is_configured": true, 00:24:23.136 "data_offset": 2048, 00:24:23.136 "data_size": 63488 00:24:23.136 } 00:24:23.136 ] 00:24:23.136 }' 00:24:23.136 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:23.136 13:50:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:23.703 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:23.703 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:23.703 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:23.703 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:23.703 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:23.703 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.704 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.002 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:24.002 "name": "raid_bdev1", 00:24:24.002 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:24.002 "strip_size_kb": 0, 00:24:24.002 "state": "online", 00:24:24.002 "raid_level": "raid1", 00:24:24.002 "superblock": true, 00:24:24.002 "num_base_bdevs": 2, 00:24:24.002 "num_base_bdevs_discovered": 1, 00:24:24.002 "num_base_bdevs_operational": 1, 00:24:24.002 "base_bdevs_list": [ 00:24:24.002 { 00:24:24.002 "name": null, 00:24:24.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:24.002 "is_configured": false, 00:24:24.002 "data_offset": 2048, 00:24:24.002 "data_size": 63488 00:24:24.002 }, 00:24:24.002 { 00:24:24.002 "name": "BaseBdev2", 00:24:24.002 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:24.002 "is_configured": true, 00:24:24.002 "data_offset": 2048, 00:24:24.002 "data_size": 63488 00:24:24.002 } 00:24:24.002 ] 00:24:24.002 }' 00:24:24.002 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:24.002 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:24.002 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:24.002 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:24.002 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:24.286 [2024-07-12 13:50:12.681019] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:24.286 
[2024-07-12 13:50:12.733665] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa8b7a0 00:24:24.286 13:50:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:24.286 [2024-07-12 13:50:12.735202] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:24.286 [2024-07-12 13:50:12.854556] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:24.286 [2024-07-12 13:50:12.854937] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:24.644 [2024-07-12 13:50:12.966940] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:24.644 [2024-07-12 13:50:12.967088] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:24.925 [2024-07-12 13:50:13.364716] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:25.191 [2024-07-12 13:50:13.695277] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:25.191 13:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:25.191 13:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:25.191 13:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:25.191 13:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:25.191 13:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:25.191 13:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.191 13:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.450 [2024-07-12 13:50:13.831040] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:25.450 13:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:25.450 "name": "raid_bdev1", 00:24:25.450 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:25.450 "strip_size_kb": 0, 00:24:25.450 "state": "online", 00:24:25.450 "raid_level": "raid1", 00:24:25.450 "superblock": true, 00:24:25.450 "num_base_bdevs": 2, 00:24:25.450 "num_base_bdevs_discovered": 2, 00:24:25.450 "num_base_bdevs_operational": 2, 00:24:25.450 "process": { 00:24:25.450 "type": "rebuild", 00:24:25.450 "target": "spare", 00:24:25.450 "progress": { 00:24:25.450 "blocks": 16384, 00:24:25.450 "percent": 25 00:24:25.450 } 00:24:25.450 }, 00:24:25.450 "base_bdevs_list": [ 00:24:25.450 { 00:24:25.450 "name": "spare", 00:24:25.450 "uuid": "58d77626-2cde-5d13-b13f-6cf4152839d5", 00:24:25.450 "is_configured": true, 00:24:25.450 "data_offset": 2048, 00:24:25.450 "data_size": 63488 00:24:25.450 }, 00:24:25.450 { 00:24:25.450 "name": "BaseBdev2", 00:24:25.450 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:25.450 "is_configured": true, 00:24:25.450 "data_offset": 2048, 00:24:25.450 "data_size": 63488 00:24:25.450 } 00:24:25.450 ] 00:24:25.450 }' 
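The trace above is the verify_raid_bdev_process check from test/bdev/bdev_raid.sh: it queries bdev_raid_get_bdevs over the test RPC socket, selects the raid_bdev1 entry with jq, and compares the reported process type and target against the expected values. A minimal sketch of that pattern, reconstructed from the traced commands (the rpc.py path, socket and jq filters are taken from this log; the helper body is a simplified sketch, not the exact script source):

    # Sketch of the verification step traced at bdev_raid.sh@182-@190 (simplified).
    verify_raid_bdev_process() {
        local raid_bdev_name=$1 process_type=$2 target=$3
        local raid_bdev_info
        # Fetch all raid bdevs from the test app and keep only the one under test.
        raid_bdev_info=$(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
            -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
            jq -r ".[] | select(.name == \"$raid_bdev_name\")")
        # Compare the reported background process against what the test expects.
        [[ $(echo "$raid_bdev_info" | jq -r '.process.type // "none"') == "$process_type" ]] || return 1
        [[ $(echo "$raid_bdev_info" | jq -r '.process.target // "none"') == "$target" ]] || return 1
    }

In this run it is invoked as verify_raid_bdev_process raid_bdev1 rebuild spare while the rebuild is in progress, and as verify_raid_bdev_process raid_bdev1 none none once no background process is reported.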
00:24:25.450 13:50:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:25.708 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=878 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.708 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:25.708 "name": "raid_bdev1", 00:24:25.708 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:25.708 "strip_size_kb": 0, 00:24:25.709 "state": "online", 00:24:25.709 "raid_level": "raid1", 00:24:25.709 "superblock": true, 00:24:25.709 "num_base_bdevs": 2, 00:24:25.709 "num_base_bdevs_discovered": 2, 00:24:25.709 "num_base_bdevs_operational": 2, 00:24:25.709 "process": { 00:24:25.709 "type": "rebuild", 00:24:25.709 "target": "spare", 00:24:25.709 "progress": { 00:24:25.709 "blocks": 20480, 00:24:25.709 "percent": 32 00:24:25.709 } 00:24:25.709 }, 00:24:25.709 "base_bdevs_list": [ 00:24:25.709 { 00:24:25.709 "name": "spare", 00:24:25.709 "uuid": "58d77626-2cde-5d13-b13f-6cf4152839d5", 00:24:25.709 "is_configured": true, 00:24:25.709 "data_offset": 2048, 00:24:25.709 "data_size": 63488 00:24:25.709 }, 00:24:25.709 { 00:24:25.709 "name": "BaseBdev2", 00:24:25.709 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:25.709 "is_configured": true, 00:24:25.709 "data_offset": 2048, 00:24:25.709 "data_size": 63488 00:24:25.709 } 00:24:25.709 ] 00:24:25.709 }' 00:24:25.709 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
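The bdev_raid.sh@705-@710 lines traced here form a bounded polling loop: rebuild progress is re-checked roughly once per second until the process entry is no longer reported or a deadline on bash's SECONDS counter is reached (878 in this trace). A simplified sketch of that loop, assuming the verify helper sketched above:

    # Sketch of the progress-polling loop traced at bdev_raid.sh@705-@710 (simplified).
    # 'timeout' is a deadline against bash's SECONDS counter; 878 is the value captured here.
    timeout=878
    while (( SECONDS < timeout )); do
        verify_raid_bdev_process raid_bdev1 rebuild spare || break   # rebuild no longer reported
        sleep 1
    done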
00:24:25.967 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:25.967 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:25.967 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:25.967 13:50:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:25.967 [2024-07-12 13:50:14.484867] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:24:26.533 [2024-07-12 13:50:14.951866] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:26.792 [2024-07-12 13:50:15.298812] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:24:26.792 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:26.792 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:26.792 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:26.792 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:26.792 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:26.792 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:26.792 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.792 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.050 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:27.050 "name": "raid_bdev1", 00:24:27.050 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:27.050 "strip_size_kb": 0, 00:24:27.050 "state": "online", 00:24:27.050 "raid_level": "raid1", 00:24:27.050 "superblock": true, 00:24:27.050 "num_base_bdevs": 2, 00:24:27.050 "num_base_bdevs_discovered": 2, 00:24:27.050 "num_base_bdevs_operational": 2, 00:24:27.050 "process": { 00:24:27.050 "type": "rebuild", 00:24:27.050 "target": "spare", 00:24:27.050 "progress": { 00:24:27.050 "blocks": 43008, 00:24:27.050 "percent": 67 00:24:27.050 } 00:24:27.050 }, 00:24:27.050 "base_bdevs_list": [ 00:24:27.050 { 00:24:27.050 "name": "spare", 00:24:27.050 "uuid": "58d77626-2cde-5d13-b13f-6cf4152839d5", 00:24:27.050 "is_configured": true, 00:24:27.050 "data_offset": 2048, 00:24:27.050 "data_size": 63488 00:24:27.050 }, 00:24:27.050 { 00:24:27.050 "name": "BaseBdev2", 00:24:27.050 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:27.050 "is_configured": true, 00:24:27.050 "data_offset": 2048, 00:24:27.050 "data_size": 63488 00:24:27.050 } 00:24:27.050 ] 00:24:27.050 }' 00:24:27.050 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:27.050 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:27.050 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:27.309 [2024-07-12 13:50:15.637516] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:24:27.309 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:27.309 13:50:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:28.246 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:28.246 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:28.246 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:28.246 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:28.246 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:28.246 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:28.246 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.246 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.246 [2024-07-12 13:50:16.756275] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:28.505 [2024-07-12 13:50:16.864521] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:28.505 [2024-07-12 13:50:16.866365] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:28.505 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:28.505 "name": "raid_bdev1", 00:24:28.505 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:28.505 "strip_size_kb": 0, 00:24:28.505 "state": "online", 00:24:28.505 "raid_level": "raid1", 00:24:28.505 "superblock": true, 00:24:28.505 "num_base_bdevs": 2, 00:24:28.505 "num_base_bdevs_discovered": 2, 00:24:28.505 "num_base_bdevs_operational": 2, 00:24:28.505 "base_bdevs_list": [ 00:24:28.505 { 00:24:28.505 "name": "spare", 00:24:28.505 "uuid": "58d77626-2cde-5d13-b13f-6cf4152839d5", 00:24:28.505 "is_configured": true, 00:24:28.505 "data_offset": 2048, 00:24:28.505 "data_size": 63488 00:24:28.505 }, 00:24:28.505 { 00:24:28.505 "name": "BaseBdev2", 00:24:28.505 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:28.505 "is_configured": true, 00:24:28.505 "data_offset": 2048, 00:24:28.505 "data_size": 63488 00:24:28.505 } 00:24:28.505 ] 00:24:28.505 }' 00:24:28.505 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:28.505 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:28.505 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:28.505 13:50:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:28.505 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:24:28.505 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:28.505 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:28.505 13:50:17 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:28.505 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:28.505 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:28.505 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.505 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:28.764 "name": "raid_bdev1", 00:24:28.764 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:28.764 "strip_size_kb": 0, 00:24:28.764 "state": "online", 00:24:28.764 "raid_level": "raid1", 00:24:28.764 "superblock": true, 00:24:28.764 "num_base_bdevs": 2, 00:24:28.764 "num_base_bdevs_discovered": 2, 00:24:28.764 "num_base_bdevs_operational": 2, 00:24:28.764 "base_bdevs_list": [ 00:24:28.764 { 00:24:28.764 "name": "spare", 00:24:28.764 "uuid": "58d77626-2cde-5d13-b13f-6cf4152839d5", 00:24:28.764 "is_configured": true, 00:24:28.764 "data_offset": 2048, 00:24:28.764 "data_size": 63488 00:24:28.764 }, 00:24:28.764 { 00:24:28.764 "name": "BaseBdev2", 00:24:28.764 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:28.764 "is_configured": true, 00:24:28.764 "data_offset": 2048, 00:24:28.764 "data_size": 63488 00:24:28.764 } 00:24:28.764 ] 00:24:28.764 }' 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.764 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.023 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:29.023 "name": "raid_bdev1", 00:24:29.023 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:29.023 "strip_size_kb": 0, 00:24:29.023 "state": "online", 00:24:29.023 "raid_level": "raid1", 00:24:29.023 "superblock": true, 00:24:29.023 "num_base_bdevs": 2, 00:24:29.023 "num_base_bdevs_discovered": 2, 00:24:29.023 "num_base_bdevs_operational": 2, 00:24:29.023 "base_bdevs_list": [ 00:24:29.023 { 00:24:29.023 "name": "spare", 00:24:29.023 "uuid": "58d77626-2cde-5d13-b13f-6cf4152839d5", 00:24:29.023 "is_configured": true, 00:24:29.023 "data_offset": 2048, 00:24:29.023 "data_size": 63488 00:24:29.023 }, 00:24:29.023 { 00:24:29.023 "name": "BaseBdev2", 00:24:29.023 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:29.023 "is_configured": true, 00:24:29.023 "data_offset": 2048, 00:24:29.023 "data_size": 63488 00:24:29.023 } 00:24:29.023 ] 00:24:29.023 }' 00:24:29.023 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:29.023 13:50:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:29.958 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:29.958 [2024-07-12 13:50:18.357593] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:29.958 [2024-07-12 13:50:18.357629] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:29.958 00:24:29.958 Latency(us) 00:24:29.958 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:29.958 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:29.958 raid_bdev1 : 10.36 104.30 312.89 0.00 0.00 12765.76 293.84 118534.68 00:24:29.958 =================================================================================================================== 00:24:29.958 Total : 104.30 312.89 0.00 0.00 12765.76 293.84 118534.68 00:24:29.958 [2024-07-12 13:50:18.409676] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:29.958 [2024-07-12 13:50:18.409705] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:29.958 [2024-07-12 13:50:18.409778] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:29.958 [2024-07-12 13:50:18.409790] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa8a9b0 name raid_bdev1, state offline 00:24:29.958 0 00:24:29.958 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.958 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:30.216 13:50:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:30.216 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:30.216 /dev/nbd0 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:30.475 1+0 records in 00:24:30.475 1+0 records out 00:24:30.475 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259196 s, 15.8 MB/s 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 
/dev/nbd1 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:30.475 13:50:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:30.733 /dev/nbd1 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:30.733 1+0 records in 00:24:30.733 1+0 records out 00:24:30.733 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270841 s, 15.1 MB/s 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 
-- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:30.733 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:30.992 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:31.251 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:31.251 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:31.251 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:31.251 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:31.251 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:31.251 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:31.251 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:31.251 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:31.251 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:31.251 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:31.510 13:50:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:31.768 [2024-07-12 13:50:20.094615] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:31.768 [2024-07-12 13:50:20.094673] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:31.768 [2024-07-12 13:50:20.094699] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8eb7b0 00:24:31.768 [2024-07-12 13:50:20.094713] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:31.768 [2024-07-12 13:50:20.096367] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:31.768 [2024-07-12 13:50:20.096395] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:31.768 [2024-07-12 13:50:20.096480] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:31.768 [2024-07-12 13:50:20.096507] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:31.769 [2024-07-12 13:50:20.096610] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:31.769 spare 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.769 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.769 [2024-07-12 13:50:20.196932] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa8ac30 00:24:31.769 [2024-07-12 13:50:20.196951] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:31.769 [2024-07-12 13:50:20.197144] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa83dd0 00:24:31.769 [2024-07-12 13:50:20.197291] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa8ac30 00:24:31.769 [2024-07-12 13:50:20.197301] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0xa8ac30 00:24:31.769 [2024-07-12 13:50:20.197408] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:32.027 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:32.027 "name": "raid_bdev1", 00:24:32.027 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:32.027 "strip_size_kb": 0, 00:24:32.027 "state": "online", 00:24:32.027 "raid_level": "raid1", 00:24:32.027 "superblock": true, 00:24:32.027 "num_base_bdevs": 2, 00:24:32.027 "num_base_bdevs_discovered": 2, 00:24:32.027 "num_base_bdevs_operational": 2, 00:24:32.027 "base_bdevs_list": [ 00:24:32.027 { 00:24:32.027 "name": "spare", 00:24:32.027 "uuid": "58d77626-2cde-5d13-b13f-6cf4152839d5", 00:24:32.027 "is_configured": true, 00:24:32.027 "data_offset": 2048, 00:24:32.027 "data_size": 63488 00:24:32.027 }, 00:24:32.027 { 00:24:32.027 "name": "BaseBdev2", 00:24:32.027 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:32.027 "is_configured": true, 00:24:32.027 "data_offset": 2048, 00:24:32.027 "data_size": 63488 00:24:32.027 } 00:24:32.027 ] 00:24:32.027 }' 00:24:32.027 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:32.027 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:32.594 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:32.594 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:32.594 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:32.594 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:32.594 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:32.594 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.594 13:50:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:32.594 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:32.594 "name": "raid_bdev1", 00:24:32.594 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:32.594 "strip_size_kb": 0, 00:24:32.595 "state": "online", 00:24:32.595 "raid_level": "raid1", 00:24:32.595 "superblock": true, 00:24:32.595 "num_base_bdevs": 2, 00:24:32.595 "num_base_bdevs_discovered": 2, 00:24:32.595 "num_base_bdevs_operational": 2, 00:24:32.595 "base_bdevs_list": [ 00:24:32.595 { 00:24:32.595 "name": "spare", 00:24:32.595 "uuid": "58d77626-2cde-5d13-b13f-6cf4152839d5", 00:24:32.595 "is_configured": true, 00:24:32.595 "data_offset": 2048, 00:24:32.595 "data_size": 63488 00:24:32.595 }, 00:24:32.595 { 00:24:32.595 "name": "BaseBdev2", 00:24:32.595 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:32.595 "is_configured": true, 00:24:32.595 "data_offset": 2048, 00:24:32.595 "data_size": 63488 00:24:32.595 } 00:24:32.595 ] 00:24:32.595 }' 00:24:32.595 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:32.595 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:32.595 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:32.852 13:50:21 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:32.852 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.852 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:33.110 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:33.110 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:33.369 [2024-07-12 13:50:21.759368] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.369 13:50:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.627 13:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:33.627 "name": "raid_bdev1", 00:24:33.627 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:33.627 "strip_size_kb": 0, 00:24:33.627 "state": "online", 00:24:33.627 "raid_level": "raid1", 00:24:33.627 "superblock": true, 00:24:33.627 "num_base_bdevs": 2, 00:24:33.627 "num_base_bdevs_discovered": 1, 00:24:33.627 "num_base_bdevs_operational": 1, 00:24:33.627 "base_bdevs_list": [ 00:24:33.627 { 00:24:33.627 "name": null, 00:24:33.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.627 "is_configured": false, 00:24:33.627 "data_offset": 2048, 00:24:33.627 "data_size": 63488 00:24:33.627 }, 00:24:33.627 { 00:24:33.627 "name": "BaseBdev2", 00:24:33.627 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:33.627 "is_configured": true, 00:24:33.627 "data_offset": 2048, 00:24:33.627 "data_size": 63488 00:24:33.627 } 00:24:33.627 ] 00:24:33.627 }' 00:24:33.627 13:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:33.627 13:50:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:34.191 13:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:34.449 [2024-07-12 13:50:22.850409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:34.449 [2024-07-12 13:50:22.850563] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:34.449 [2024-07-12 13:50:22.850580] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:34.449 [2024-07-12 13:50:22.850610] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:34.449 [2024-07-12 13:50:22.855839] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa83dd0 00:24:34.449 [2024-07-12 13:50:22.858166] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:34.449 13:50:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:35.383 13:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:35.383 13:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:35.383 13:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:35.383 13:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:35.383 13:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:35.383 13:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.383 13:50:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.641 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:35.641 "name": "raid_bdev1", 00:24:35.641 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:35.641 "strip_size_kb": 0, 00:24:35.641 "state": "online", 00:24:35.641 "raid_level": "raid1", 00:24:35.641 "superblock": true, 00:24:35.641 "num_base_bdevs": 2, 00:24:35.641 "num_base_bdevs_discovered": 2, 00:24:35.641 "num_base_bdevs_operational": 2, 00:24:35.641 "process": { 00:24:35.641 "type": "rebuild", 00:24:35.641 "target": "spare", 00:24:35.641 "progress": { 00:24:35.641 "blocks": 24576, 00:24:35.641 "percent": 38 00:24:35.641 } 00:24:35.641 }, 00:24:35.641 "base_bdevs_list": [ 00:24:35.641 { 00:24:35.641 "name": "spare", 00:24:35.641 "uuid": "58d77626-2cde-5d13-b13f-6cf4152839d5", 00:24:35.641 "is_configured": true, 00:24:35.641 "data_offset": 2048, 00:24:35.641 "data_size": 63488 00:24:35.641 }, 00:24:35.641 { 00:24:35.641 "name": "BaseBdev2", 00:24:35.641 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:35.641 "is_configured": true, 00:24:35.641 "data_offset": 2048, 00:24:35.641 "data_size": 63488 00:24:35.641 } 00:24:35.641 ] 00:24:35.641 }' 00:24:35.641 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:35.641 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:35.641 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:35.641 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:35.641 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:35.899 [2024-07-12 13:50:24.444661] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:35.899 [2024-07-12 13:50:24.471015] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:35.899 [2024-07-12 13:50:24.471059] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:35.899 [2024-07-12 13:50:24.471074] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:35.899 [2024-07-12 13:50:24.471082] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.158 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.417 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:36.417 "name": "raid_bdev1", 00:24:36.417 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:36.417 "strip_size_kb": 0, 00:24:36.417 "state": "online", 00:24:36.417 "raid_level": "raid1", 00:24:36.417 "superblock": true, 00:24:36.417 "num_base_bdevs": 2, 00:24:36.417 "num_base_bdevs_discovered": 1, 00:24:36.417 "num_base_bdevs_operational": 1, 00:24:36.417 "base_bdevs_list": [ 00:24:36.417 { 00:24:36.417 "name": null, 00:24:36.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.417 "is_configured": false, 00:24:36.417 "data_offset": 2048, 00:24:36.417 "data_size": 63488 00:24:36.417 }, 00:24:36.417 { 00:24:36.417 "name": "BaseBdev2", 00:24:36.417 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:36.417 "is_configured": true, 00:24:36.417 "data_offset": 2048, 00:24:36.417 "data_size": 63488 00:24:36.417 } 00:24:36.417 ] 00:24:36.417 }' 00:24:36.417 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:36.417 13:50:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:36.984 13:50:25 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:37.242 [2024-07-12 13:50:25.567435] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:37.242 [2024-07-12 13:50:25.567488] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:37.242 [2024-07-12 13:50:25.567510] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa846e0 00:24:37.242 [2024-07-12 13:50:25.567522] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:37.242 [2024-07-12 13:50:25.567910] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:37.242 [2024-07-12 13:50:25.567938] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:37.242 [2024-07-12 13:50:25.568025] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:37.242 [2024-07-12 13:50:25.568037] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:37.242 [2024-07-12 13:50:25.568049] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:37.242 [2024-07-12 13:50:25.568068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:37.242 [2024-07-12 13:50:25.573319] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa83dd0 00:24:37.242 spare 00:24:37.242 [2024-07-12 13:50:25.574758] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:37.242 13:50:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:38.178 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:38.178 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:38.178 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:38.178 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:38.178 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:38.178 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.178 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.437 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.437 "name": "raid_bdev1", 00:24:38.437 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:38.437 "strip_size_kb": 0, 00:24:38.437 "state": "online", 00:24:38.437 "raid_level": "raid1", 00:24:38.437 "superblock": true, 00:24:38.437 "num_base_bdevs": 2, 00:24:38.437 "num_base_bdevs_discovered": 2, 00:24:38.437 "num_base_bdevs_operational": 2, 00:24:38.437 "process": { 00:24:38.437 "type": "rebuild", 00:24:38.437 "target": "spare", 00:24:38.437 "progress": { 00:24:38.437 "blocks": 24576, 00:24:38.437 "percent": 38 00:24:38.437 } 00:24:38.437 }, 00:24:38.437 "base_bdevs_list": [ 00:24:38.437 { 00:24:38.437 "name": "spare", 00:24:38.437 "uuid": 
"58d77626-2cde-5d13-b13f-6cf4152839d5", 00:24:38.437 "is_configured": true, 00:24:38.437 "data_offset": 2048, 00:24:38.437 "data_size": 63488 00:24:38.437 }, 00:24:38.437 { 00:24:38.437 "name": "BaseBdev2", 00:24:38.437 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:38.437 "is_configured": true, 00:24:38.437 "data_offset": 2048, 00:24:38.437 "data_size": 63488 00:24:38.437 } 00:24:38.437 ] 00:24:38.437 }' 00:24:38.437 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.437 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:38.437 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:38.437 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:38.437 13:50:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:38.696 [2024-07-12 13:50:27.147165] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:38.696 [2024-07-12 13:50:27.187673] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:38.696 [2024-07-12 13:50:27.187714] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:38.696 [2024-07-12 13:50:27.187730] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:38.696 [2024-07-12 13:50:27.187738] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.696 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.955 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:38.955 "name": "raid_bdev1", 00:24:38.955 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:38.955 "strip_size_kb": 0, 00:24:38.955 "state": "online", 00:24:38.955 "raid_level": "raid1", 00:24:38.955 "superblock": true, 00:24:38.955 "num_base_bdevs": 2, 
00:24:38.955 "num_base_bdevs_discovered": 1, 00:24:38.955 "num_base_bdevs_operational": 1, 00:24:38.955 "base_bdevs_list": [ 00:24:38.955 { 00:24:38.955 "name": null, 00:24:38.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:38.955 "is_configured": false, 00:24:38.955 "data_offset": 2048, 00:24:38.955 "data_size": 63488 00:24:38.955 }, 00:24:38.955 { 00:24:38.955 "name": "BaseBdev2", 00:24:38.955 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:38.955 "is_configured": true, 00:24:38.955 "data_offset": 2048, 00:24:38.955 "data_size": 63488 00:24:38.955 } 00:24:38.955 ] 00:24:38.955 }' 00:24:38.955 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:38.955 13:50:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:39.522 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:39.522 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:39.522 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:39.522 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:39.522 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:39.522 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.522 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.779 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:39.779 "name": "raid_bdev1", 00:24:39.779 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:39.779 "strip_size_kb": 0, 00:24:39.779 "state": "online", 00:24:39.779 "raid_level": "raid1", 00:24:39.779 "superblock": true, 00:24:39.779 "num_base_bdevs": 2, 00:24:39.779 "num_base_bdevs_discovered": 1, 00:24:39.779 "num_base_bdevs_operational": 1, 00:24:39.779 "base_bdevs_list": [ 00:24:39.779 { 00:24:39.779 "name": null, 00:24:39.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.779 "is_configured": false, 00:24:39.779 "data_offset": 2048, 00:24:39.779 "data_size": 63488 00:24:39.779 }, 00:24:39.779 { 00:24:39.779 "name": "BaseBdev2", 00:24:39.779 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:39.779 "is_configured": true, 00:24:39.779 "data_offset": 2048, 00:24:39.779 "data_size": 63488 00:24:39.779 } 00:24:39.779 ] 00:24:39.779 }' 00:24:39.779 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:39.779 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:39.779 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:40.038 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:40.038 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:40.296 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
BaseBdev1_malloc -p BaseBdev1 00:24:40.296 [2024-07-12 13:50:28.877043] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:40.296 [2024-07-12 13:50:28.877099] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:40.296 [2024-07-12 13:50:28.877120] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8ec3f0 00:24:40.296 [2024-07-12 13:50:28.877133] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:40.296 [2024-07-12 13:50:28.877500] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:40.296 [2024-07-12 13:50:28.877517] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:40.296 [2024-07-12 13:50:28.877589] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:40.296 [2024-07-12 13:50:28.877602] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:40.296 [2024-07-12 13:50:28.877613] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:40.554 BaseBdev1 00:24:40.554 13:50:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.488 13:50:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.746 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:41.746 "name": "raid_bdev1", 00:24:41.746 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:41.746 "strip_size_kb": 0, 00:24:41.746 "state": "online", 00:24:41.746 "raid_level": "raid1", 00:24:41.746 "superblock": true, 00:24:41.746 "num_base_bdevs": 2, 00:24:41.746 "num_base_bdevs_discovered": 1, 00:24:41.746 "num_base_bdevs_operational": 1, 00:24:41.746 "base_bdevs_list": [ 00:24:41.746 { 00:24:41.746 "name": null, 00:24:41.746 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.746 "is_configured": false, 00:24:41.746 "data_offset": 2048, 00:24:41.746 "data_size": 63488 00:24:41.746 }, 00:24:41.746 { 00:24:41.746 "name": 
"BaseBdev2", 00:24:41.746 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:41.746 "is_configured": true, 00:24:41.746 "data_offset": 2048, 00:24:41.746 "data_size": 63488 00:24:41.746 } 00:24:41.746 ] 00:24:41.746 }' 00:24:41.746 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:41.746 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:42.313 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:42.313 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:42.313 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:42.313 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:42.313 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:42.313 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.313 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.572 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:42.572 "name": "raid_bdev1", 00:24:42.572 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:42.572 "strip_size_kb": 0, 00:24:42.572 "state": "online", 00:24:42.572 "raid_level": "raid1", 00:24:42.572 "superblock": true, 00:24:42.572 "num_base_bdevs": 2, 00:24:42.572 "num_base_bdevs_discovered": 1, 00:24:42.572 "num_base_bdevs_operational": 1, 00:24:42.572 "base_bdevs_list": [ 00:24:42.572 { 00:24:42.572 "name": null, 00:24:42.572 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:42.572 "is_configured": false, 00:24:42.572 "data_offset": 2048, 00:24:42.572 "data_size": 63488 00:24:42.572 }, 00:24:42.572 { 00:24:42.572 "name": "BaseBdev2", 00:24:42.572 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:42.572 "is_configured": true, 00:24:42.572 "data_offset": 2048, 00:24:42.572 "data_size": 63488 00:24:42.572 } 00:24:42.572 ] 00:24:42.572 }' 00:24:42.572 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:42.572 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:42.572 13:50:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:42.572 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:42.831 [2024-07-12 13:50:31.267711] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:42.831 [2024-07-12 13:50:31.267841] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:42.831 [2024-07-12 13:50:31.267857] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:42.831 request: 00:24:42.831 { 00:24:42.831 "base_bdev": "BaseBdev1", 00:24:42.831 "raid_bdev": "raid_bdev1", 00:24:42.831 "method": "bdev_raid_add_base_bdev", 00:24:42.831 "req_id": 1 00:24:42.831 } 00:24:42.831 Got JSON-RPC error response 00:24:42.831 response: 00:24:42.831 { 00:24:42.831 "code": -22, 00:24:42.831 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:42.831 } 00:24:42.831 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:24:42.831 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:42.831 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:42.831 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:42.831 13:50:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.769 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.027 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.027 "name": "raid_bdev1", 00:24:44.027 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:44.027 "strip_size_kb": 0, 00:24:44.027 "state": "online", 00:24:44.027 "raid_level": "raid1", 00:24:44.027 "superblock": true, 00:24:44.027 "num_base_bdevs": 2, 00:24:44.027 "num_base_bdevs_discovered": 1, 00:24:44.027 "num_base_bdevs_operational": 1, 00:24:44.027 "base_bdevs_list": [ 00:24:44.027 { 00:24:44.027 "name": null, 00:24:44.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.027 "is_configured": false, 00:24:44.027 "data_offset": 2048, 00:24:44.027 "data_size": 63488 00:24:44.027 }, 00:24:44.027 { 00:24:44.027 "name": "BaseBdev2", 00:24:44.027 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:44.027 "is_configured": true, 00:24:44.027 "data_offset": 2048, 00:24:44.027 "data_size": 63488 00:24:44.027 } 00:24:44.027 ] 00:24:44.027 }' 00:24:44.027 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.027 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:44.594 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:44.594 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:44.594 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:44.594 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:44.594 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:44.594 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.594 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.852 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:44.852 "name": "raid_bdev1", 00:24:44.852 "uuid": "9cb3e5ef-9168-4dda-884f-d3a45bb2eddd", 00:24:44.852 "strip_size_kb": 0, 00:24:44.852 "state": "online", 00:24:44.852 "raid_level": "raid1", 00:24:44.852 "superblock": true, 00:24:44.852 "num_base_bdevs": 2, 00:24:44.852 "num_base_bdevs_discovered": 1, 00:24:44.852 "num_base_bdevs_operational": 1, 00:24:44.852 "base_bdevs_list": [ 00:24:44.852 { 00:24:44.852 "name": null, 00:24:44.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.852 "is_configured": false, 00:24:44.852 "data_offset": 2048, 00:24:44.852 "data_size": 63488 00:24:44.852 }, 00:24:44.852 { 00:24:44.852 "name": "BaseBdev2", 00:24:44.852 "uuid": "572f97fd-344f-5a90-860c-d761c529e1e5", 00:24:44.852 "is_configured": true, 00:24:44.852 "data_offset": 2048, 00:24:44.852 "data_size": 63488 00:24:44.852 } 00:24:44.852 ] 00:24:44.852 }' 00:24:44.852 13:50:33 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:44.852 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:44.852 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 549390 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 549390 ']' 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 549390 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 549390 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 549390' 00:24:45.117 killing process with pid 549390 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 549390 00:24:45.117 Received shutdown signal, test time was about 25.429603 seconds 00:24:45.117 00:24:45.117 Latency(us) 00:24:45.117 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:45.117 =================================================================================================================== 00:24:45.117 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:45.117 [2024-07-12 13:50:33.516977] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:45.117 [2024-07-12 13:50:33.517079] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:45.117 [2024-07-12 13:50:33.517124] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:45.117 [2024-07-12 13:50:33.517136] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa8ac30 name raid_bdev1, state offline 00:24:45.117 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 549390 00:24:45.117 [2024-07-12 13:50:33.542036] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:45.375 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:45.375 00:24:45.375 real 0m30.324s 00:24:45.375 user 0m47.780s 00:24:45.375 sys 0m4.591s 00:24:45.375 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:45.375 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:45.375 ************************************ 00:24:45.375 END TEST raid_rebuild_test_sb_io 00:24:45.376 ************************************ 00:24:45.376 13:50:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:45.376 13:50:33 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:24:45.376 13:50:33 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 
false false true 00:24:45.376 13:50:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:45.376 13:50:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:45.376 13:50:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:45.376 ************************************ 00:24:45.376 START TEST raid_rebuild_test 00:24:45.376 ************************************ 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=553759 
00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 553759 /var/tmp/spdk-raid.sock 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 553759 ']' 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:45.376 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:45.376 13:50:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:45.376 [2024-07-12 13:50:33.939320] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:24:45.376 [2024-07-12 13:50:33.939389] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid553759 ] 00:24:45.376 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:45.376 Zero copy mechanism will not be used. 00:24:45.635 [2024-07-12 13:50:34.071107] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:45.635 [2024-07-12 13:50:34.173521] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:45.894 [2024-07-12 13:50:34.233418] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:45.894 [2024-07-12 13:50:34.233453] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:46.461 13:50:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:46.461 13:50:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:24:46.461 13:50:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:46.461 13:50:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:46.719 BaseBdev1_malloc 00:24:46.719 13:50:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:46.979 [2024-07-12 13:50:35.354157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:46.979 [2024-07-12 13:50:35.354209] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:46.979 [2024-07-12 13:50:35.354231] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc13680 00:24:46.979 [2024-07-12 13:50:35.354243] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:46.979 [2024-07-12 13:50:35.355827] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:24:46.979 [2024-07-12 13:50:35.355855] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:46.979 BaseBdev1 00:24:46.979 13:50:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:46.979 13:50:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:46.979 BaseBdev2_malloc 00:24:47.236 13:50:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:47.236 [2024-07-12 13:50:35.711913] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:47.236 [2024-07-12 13:50:35.711965] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:47.236 [2024-07-12 13:50:35.711988] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc141a0 00:24:47.236 [2024-07-12 13:50:35.712001] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:47.236 [2024-07-12 13:50:35.713386] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:47.236 [2024-07-12 13:50:35.713413] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:47.236 BaseBdev2 00:24:47.236 13:50:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:47.236 13:50:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:47.495 BaseBdev3_malloc 00:24:47.495 13:50:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:47.495 [2024-07-12 13:50:36.073675] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:47.495 [2024-07-12 13:50:36.073722] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:47.495 [2024-07-12 13:50:36.073742] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc1230 00:24:47.495 [2024-07-12 13:50:36.073755] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:47.495 [2024-07-12 13:50:36.075225] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:47.495 [2024-07-12 13:50:36.075255] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:47.753 BaseBdev3 00:24:47.753 13:50:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:47.753 13:50:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:47.753 BaseBdev4_malloc 00:24:48.011 13:50:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:48.011 [2024-07-12 13:50:36.575655] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 
00:24:48.011 [2024-07-12 13:50:36.575704] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:48.011 [2024-07-12 13:50:36.575723] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc0410 00:24:48.011 [2024-07-12 13:50:36.575736] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:48.011 [2024-07-12 13:50:36.577144] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:48.011 [2024-07-12 13:50:36.577172] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:48.011 BaseBdev4 00:24:48.269 13:50:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:48.269 spare_malloc 00:24:48.269 13:50:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:48.528 spare_delay 00:24:48.528 13:50:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:48.528 [2024-07-12 13:50:37.097487] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:48.528 [2024-07-12 13:50:37.097532] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:48.528 [2024-07-12 13:50:37.097550] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdc4ef0 00:24:48.528 [2024-07-12 13:50:37.097563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:48.528 [2024-07-12 13:50:37.098957] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:48.528 [2024-07-12 13:50:37.098983] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:48.528 spare 00:24:48.785 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:48.785 [2024-07-12 13:50:37.346172] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:48.785 [2024-07-12 13:50:37.347338] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:48.785 [2024-07-12 13:50:37.347392] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:48.785 [2024-07-12 13:50:37.347438] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:48.785 [2024-07-12 13:50:37.347513] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd441e0 00:24:48.785 [2024-07-12 13:50:37.347523] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:48.785 [2024-07-12 13:50:37.347712] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdbe750 00:24:48.785 [2024-07-12 13:50:37.347853] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd441e0 00:24:48.785 [2024-07-12 13:50:37.347863] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd441e0 00:24:48.785 [2024-07-12 13:50:37.347973] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:49.044 "name": "raid_bdev1", 00:24:49.044 "uuid": "80823e56-a08e-4fbc-b370-f7808812cfb2", 00:24:49.044 "strip_size_kb": 0, 00:24:49.044 "state": "online", 00:24:49.044 "raid_level": "raid1", 00:24:49.044 "superblock": false, 00:24:49.044 "num_base_bdevs": 4, 00:24:49.044 "num_base_bdevs_discovered": 4, 00:24:49.044 "num_base_bdevs_operational": 4, 00:24:49.044 "base_bdevs_list": [ 00:24:49.044 { 00:24:49.044 "name": "BaseBdev1", 00:24:49.044 "uuid": "b83c0402-03bd-5697-87e3-d5c40161cd63", 00:24:49.044 "is_configured": true, 00:24:49.044 "data_offset": 0, 00:24:49.044 "data_size": 65536 00:24:49.044 }, 00:24:49.044 { 00:24:49.044 "name": "BaseBdev2", 00:24:49.044 "uuid": "e3f8406c-c611-542a-9beb-c750648d2e9a", 00:24:49.044 "is_configured": true, 00:24:49.044 "data_offset": 0, 00:24:49.044 "data_size": 65536 00:24:49.044 }, 00:24:49.044 { 00:24:49.044 "name": "BaseBdev3", 00:24:49.044 "uuid": "3baa5622-0708-537f-bf0f-74517f630007", 00:24:49.044 "is_configured": true, 00:24:49.044 "data_offset": 0, 00:24:49.044 "data_size": 65536 00:24:49.044 }, 00:24:49.044 { 00:24:49.044 "name": "BaseBdev4", 00:24:49.044 "uuid": "e72aaead-94b1-5774-a977-15402323d285", 00:24:49.044 "is_configured": true, 00:24:49.044 "data_offset": 0, 00:24:49.044 "data_size": 65536 00:24:49.044 } 00:24:49.044 ] 00:24:49.044 }' 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:49.044 13:50:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:49.980 13:50:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:49.980 13:50:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:49.980 [2024-07-12 13:50:38.453423] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:49.980 13:50:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- 
# raid_bdev_size=65536 00:24:49.980 13:50:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.980 13:50:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:50.238 13:50:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:50.239 13:50:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:50.497 [2024-07-12 13:50:38.954478] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdbe750 00:24:50.497 /dev/nbd0 00:24:50.497 13:50:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:50.497 13:50:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:50.497 13:50:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:50.497 13:50:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:50.497 13:50:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:50.497 13:50:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:50.498 13:50:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:50.498 13:50:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:50.498 13:50:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:50.498 13:50:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:50.498 13:50:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:50.498 1+0 records in 00:24:50.498 1+0 records out 00:24:50.498 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235763 s, 17.4 MB/s 00:24:50.498 13:50:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:50.498 
13:50:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:50.498 13:50:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:50.498 13:50:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:50.498 13:50:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:50.498 13:50:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:50.498 13:50:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:50.498 13:50:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:50.498 13:50:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:50.498 13:50:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:24:58.611 65536+0 records in 00:24:58.611 65536+0 records out 00:24:58.611 33554432 bytes (34 MB, 32 MiB) copied, 8.12497 s, 4.1 MB/s 00:24:58.611 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:58.611 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:58.611 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:58.611 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:58.611 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:58.611 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:58.611 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:58.870 [2024-07-12 13:50:47.405612] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:58.870 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:58.870 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:58.870 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:58.870 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:58.870 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:58.870 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:58.870 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:58.870 13:50:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:58.870 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:59.129 [2024-07-12 13:50:47.638292] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.129 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.387 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:59.387 "name": "raid_bdev1", 00:24:59.387 "uuid": "80823e56-a08e-4fbc-b370-f7808812cfb2", 00:24:59.387 "strip_size_kb": 0, 00:24:59.387 "state": "online", 00:24:59.387 "raid_level": "raid1", 00:24:59.387 "superblock": false, 00:24:59.387 "num_base_bdevs": 4, 00:24:59.387 "num_base_bdevs_discovered": 3, 00:24:59.387 "num_base_bdevs_operational": 3, 00:24:59.387 "base_bdevs_list": [ 00:24:59.387 { 00:24:59.387 "name": null, 00:24:59.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:59.387 "is_configured": false, 00:24:59.387 "data_offset": 0, 00:24:59.387 "data_size": 65536 00:24:59.387 }, 00:24:59.387 { 00:24:59.387 "name": "BaseBdev2", 00:24:59.387 "uuid": "e3f8406c-c611-542a-9beb-c750648d2e9a", 00:24:59.387 "is_configured": true, 00:24:59.387 "data_offset": 0, 00:24:59.387 "data_size": 65536 00:24:59.388 }, 00:24:59.388 { 00:24:59.388 "name": "BaseBdev3", 00:24:59.388 "uuid": "3baa5622-0708-537f-bf0f-74517f630007", 00:24:59.388 "is_configured": true, 00:24:59.388 "data_offset": 0, 00:24:59.388 "data_size": 65536 00:24:59.388 }, 00:24:59.388 { 00:24:59.388 "name": "BaseBdev4", 00:24:59.388 "uuid": "e72aaead-94b1-5774-a977-15402323d285", 00:24:59.388 "is_configured": true, 00:24:59.388 "data_offset": 0, 00:24:59.388 "data_size": 65536 00:24:59.388 } 00:24:59.388 ] 00:24:59.388 }' 00:24:59.388 13:50:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:59.388 13:50:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:59.956 13:50:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:00.214 [2024-07-12 13:50:48.749250] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:00.214 [2024-07-12 13:50:48.753380] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd49ff0 00:25:00.214 [2024-07-12 13:50:48.755753] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:00.214 13:50:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:01.590 13:50:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:01.590 13:50:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:01.590 13:50:49 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:01.590 13:50:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:01.590 13:50:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:01.590 13:50:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.590 13:50:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.590 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:01.590 "name": "raid_bdev1", 00:25:01.590 "uuid": "80823e56-a08e-4fbc-b370-f7808812cfb2", 00:25:01.590 "strip_size_kb": 0, 00:25:01.590 "state": "online", 00:25:01.590 "raid_level": "raid1", 00:25:01.590 "superblock": false, 00:25:01.590 "num_base_bdevs": 4, 00:25:01.590 "num_base_bdevs_discovered": 4, 00:25:01.590 "num_base_bdevs_operational": 4, 00:25:01.590 "process": { 00:25:01.590 "type": "rebuild", 00:25:01.590 "target": "spare", 00:25:01.590 "progress": { 00:25:01.590 "blocks": 24576, 00:25:01.590 "percent": 37 00:25:01.590 } 00:25:01.590 }, 00:25:01.590 "base_bdevs_list": [ 00:25:01.590 { 00:25:01.590 "name": "spare", 00:25:01.590 "uuid": "26000567-c727-564f-9fe1-8efb313fc5aa", 00:25:01.590 "is_configured": true, 00:25:01.590 "data_offset": 0, 00:25:01.590 "data_size": 65536 00:25:01.590 }, 00:25:01.590 { 00:25:01.590 "name": "BaseBdev2", 00:25:01.590 "uuid": "e3f8406c-c611-542a-9beb-c750648d2e9a", 00:25:01.590 "is_configured": true, 00:25:01.590 "data_offset": 0, 00:25:01.590 "data_size": 65536 00:25:01.590 }, 00:25:01.590 { 00:25:01.590 "name": "BaseBdev3", 00:25:01.590 "uuid": "3baa5622-0708-537f-bf0f-74517f630007", 00:25:01.590 "is_configured": true, 00:25:01.590 "data_offset": 0, 00:25:01.590 "data_size": 65536 00:25:01.590 }, 00:25:01.590 { 00:25:01.590 "name": "BaseBdev4", 00:25:01.590 "uuid": "e72aaead-94b1-5774-a977-15402323d285", 00:25:01.590 "is_configured": true, 00:25:01.590 "data_offset": 0, 00:25:01.590 "data_size": 65536 00:25:01.590 } 00:25:01.590 ] 00:25:01.590 }' 00:25:01.590 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:01.590 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:01.590 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:01.590 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:01.590 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:01.849 [2024-07-12 13:50:50.362749] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:01.849 [2024-07-12 13:50:50.368278] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:01.849 [2024-07-12 13:50:50.368322] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:01.849 [2024-07-12 13:50:50.368340] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:01.849 [2024-07-12 13:50:50.368348] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.849 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.108 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:02.108 "name": "raid_bdev1", 00:25:02.108 "uuid": "80823e56-a08e-4fbc-b370-f7808812cfb2", 00:25:02.108 "strip_size_kb": 0, 00:25:02.108 "state": "online", 00:25:02.108 "raid_level": "raid1", 00:25:02.108 "superblock": false, 00:25:02.108 "num_base_bdevs": 4, 00:25:02.108 "num_base_bdevs_discovered": 3, 00:25:02.108 "num_base_bdevs_operational": 3, 00:25:02.108 "base_bdevs_list": [ 00:25:02.108 { 00:25:02.108 "name": null, 00:25:02.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:02.108 "is_configured": false, 00:25:02.108 "data_offset": 0, 00:25:02.108 "data_size": 65536 00:25:02.108 }, 00:25:02.108 { 00:25:02.108 "name": "BaseBdev2", 00:25:02.108 "uuid": "e3f8406c-c611-542a-9beb-c750648d2e9a", 00:25:02.108 "is_configured": true, 00:25:02.108 "data_offset": 0, 00:25:02.108 "data_size": 65536 00:25:02.108 }, 00:25:02.108 { 00:25:02.108 "name": "BaseBdev3", 00:25:02.108 "uuid": "3baa5622-0708-537f-bf0f-74517f630007", 00:25:02.108 "is_configured": true, 00:25:02.108 "data_offset": 0, 00:25:02.108 "data_size": 65536 00:25:02.108 }, 00:25:02.108 { 00:25:02.108 "name": "BaseBdev4", 00:25:02.108 "uuid": "e72aaead-94b1-5774-a977-15402323d285", 00:25:02.108 "is_configured": true, 00:25:02.108 "data_offset": 0, 00:25:02.108 "data_size": 65536 00:25:02.108 } 00:25:02.108 ] 00:25:02.108 }' 00:25:02.108 13:50:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:02.108 13:50:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:02.676 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:02.676 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:02.676 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:02.676 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:02.676 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:02.935 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.935 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.935 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:02.935 "name": "raid_bdev1", 00:25:02.935 "uuid": "80823e56-a08e-4fbc-b370-f7808812cfb2", 00:25:02.935 "strip_size_kb": 0, 00:25:02.935 "state": "online", 00:25:02.935 "raid_level": "raid1", 00:25:02.935 "superblock": false, 00:25:02.935 "num_base_bdevs": 4, 00:25:02.935 "num_base_bdevs_discovered": 3, 00:25:02.935 "num_base_bdevs_operational": 3, 00:25:02.935 "base_bdevs_list": [ 00:25:02.935 { 00:25:02.935 "name": null, 00:25:02.935 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:02.935 "is_configured": false, 00:25:02.935 "data_offset": 0, 00:25:02.935 "data_size": 65536 00:25:02.935 }, 00:25:02.935 { 00:25:02.935 "name": "BaseBdev2", 00:25:02.935 "uuid": "e3f8406c-c611-542a-9beb-c750648d2e9a", 00:25:02.935 "is_configured": true, 00:25:02.935 "data_offset": 0, 00:25:02.935 "data_size": 65536 00:25:02.935 }, 00:25:02.935 { 00:25:02.935 "name": "BaseBdev3", 00:25:02.935 "uuid": "3baa5622-0708-537f-bf0f-74517f630007", 00:25:02.935 "is_configured": true, 00:25:02.935 "data_offset": 0, 00:25:02.935 "data_size": 65536 00:25:02.935 }, 00:25:02.935 { 00:25:02.935 "name": "BaseBdev4", 00:25:02.935 "uuid": "e72aaead-94b1-5774-a977-15402323d285", 00:25:02.935 "is_configured": true, 00:25:02.935 "data_offset": 0, 00:25:02.935 "data_size": 65536 00:25:02.935 } 00:25:02.935 ] 00:25:02.935 }' 00:25:02.935 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:03.194 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:03.194 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:03.194 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:03.194 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:03.452 [2024-07-12 13:50:51.816833] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:03.452 [2024-07-12 13:50:51.821516] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd49ff0 00:25:03.452 [2024-07-12 13:50:51.823077] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:03.452 13:50:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:04.389 13:50:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:04.389 13:50:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:04.389 13:50:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:04.389 13:50:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:04.389 13:50:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:04.389 13:50:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.389 13:50:52 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.653 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:04.653 "name": "raid_bdev1", 00:25:04.653 "uuid": "80823e56-a08e-4fbc-b370-f7808812cfb2", 00:25:04.653 "strip_size_kb": 0, 00:25:04.653 "state": "online", 00:25:04.653 "raid_level": "raid1", 00:25:04.653 "superblock": false, 00:25:04.653 "num_base_bdevs": 4, 00:25:04.653 "num_base_bdevs_discovered": 4, 00:25:04.653 "num_base_bdevs_operational": 4, 00:25:04.653 "process": { 00:25:04.653 "type": "rebuild", 00:25:04.653 "target": "spare", 00:25:04.653 "progress": { 00:25:04.653 "blocks": 24576, 00:25:04.653 "percent": 37 00:25:04.653 } 00:25:04.653 }, 00:25:04.653 "base_bdevs_list": [ 00:25:04.653 { 00:25:04.653 "name": "spare", 00:25:04.653 "uuid": "26000567-c727-564f-9fe1-8efb313fc5aa", 00:25:04.653 "is_configured": true, 00:25:04.653 "data_offset": 0, 00:25:04.653 "data_size": 65536 00:25:04.653 }, 00:25:04.653 { 00:25:04.653 "name": "BaseBdev2", 00:25:04.653 "uuid": "e3f8406c-c611-542a-9beb-c750648d2e9a", 00:25:04.653 "is_configured": true, 00:25:04.653 "data_offset": 0, 00:25:04.653 "data_size": 65536 00:25:04.653 }, 00:25:04.653 { 00:25:04.653 "name": "BaseBdev3", 00:25:04.653 "uuid": "3baa5622-0708-537f-bf0f-74517f630007", 00:25:04.653 "is_configured": true, 00:25:04.653 "data_offset": 0, 00:25:04.653 "data_size": 65536 00:25:04.653 }, 00:25:04.653 { 00:25:04.653 "name": "BaseBdev4", 00:25:04.653 "uuid": "e72aaead-94b1-5774-a977-15402323d285", 00:25:04.653 "is_configured": true, 00:25:04.653 "data_offset": 0, 00:25:04.653 "data_size": 65536 00:25:04.653 } 00:25:04.653 ] 00:25:04.653 }' 00:25:04.653 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:04.653 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:04.653 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:04.653 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:04.653 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:04.653 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:04.653 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:04.653 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:04.653 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:04.914 [2024-07-12 13:50:53.402736] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:04.914 [2024-07-12 13:50:53.435726] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xd49ff0 00:25:04.914 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:04.914 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:04.914 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:04.914 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:04.914 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:25:04.914 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:04.914 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:04.914 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.914 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:05.172 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:05.172 "name": "raid_bdev1", 00:25:05.172 "uuid": "80823e56-a08e-4fbc-b370-f7808812cfb2", 00:25:05.172 "strip_size_kb": 0, 00:25:05.172 "state": "online", 00:25:05.172 "raid_level": "raid1", 00:25:05.172 "superblock": false, 00:25:05.172 "num_base_bdevs": 4, 00:25:05.172 "num_base_bdevs_discovered": 3, 00:25:05.172 "num_base_bdevs_operational": 3, 00:25:05.172 "process": { 00:25:05.172 "type": "rebuild", 00:25:05.172 "target": "spare", 00:25:05.172 "progress": { 00:25:05.172 "blocks": 36864, 00:25:05.172 "percent": 56 00:25:05.172 } 00:25:05.172 }, 00:25:05.172 "base_bdevs_list": [ 00:25:05.172 { 00:25:05.172 "name": "spare", 00:25:05.172 "uuid": "26000567-c727-564f-9fe1-8efb313fc5aa", 00:25:05.172 "is_configured": true, 00:25:05.172 "data_offset": 0, 00:25:05.172 "data_size": 65536 00:25:05.172 }, 00:25:05.172 { 00:25:05.172 "name": null, 00:25:05.172 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:05.172 "is_configured": false, 00:25:05.172 "data_offset": 0, 00:25:05.172 "data_size": 65536 00:25:05.172 }, 00:25:05.172 { 00:25:05.172 "name": "BaseBdev3", 00:25:05.172 "uuid": "3baa5622-0708-537f-bf0f-74517f630007", 00:25:05.172 "is_configured": true, 00:25:05.172 "data_offset": 0, 00:25:05.172 "data_size": 65536 00:25:05.172 }, 00:25:05.172 { 00:25:05.172 "name": "BaseBdev4", 00:25:05.172 "uuid": "e72aaead-94b1-5774-a977-15402323d285", 00:25:05.172 "is_configured": true, 00:25:05.172 "data_offset": 0, 00:25:05.172 "data_size": 65536 00:25:05.172 } 00:25:05.172 ] 00:25:05.172 }' 00:25:05.172 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:05.172 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:05.172 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:05.430 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:05.431 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=917 00:25:05.431 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:05.431 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:05.431 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:05.431 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:05.431 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:05.431 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:05.431 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
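For readers following the trace: the verify_raid_bdev_process helper being executed above boils down to one RPC query plus two jq assertions, namely fetch the raid bdev over the test socket, confirm a rebuild process is running, and confirm its target is the spare. A minimal stand-alone sketch of that check, reusing only the rpc.py invocation, socket path and jq filters already shown in this trace ($SPDK_DIR below is shorthand for the workspace spdk checkout and is not a variable the test itself defines):

  raid_bdev_info=$($SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1")')
  # a rebuild must be in progress, and it must be rebuilding onto the "spare" bdev
  [[ $(echo "$raid_bdev_info" | jq -r '.process.type // "none"') == rebuild ]]
  [[ $(echo "$raid_bdev_info" | jq -r '.process.target // "none"') == spare ]]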
00:25:05.431 13:50:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:05.689 13:50:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:05.689 "name": "raid_bdev1", 00:25:05.689 "uuid": "80823e56-a08e-4fbc-b370-f7808812cfb2", 00:25:05.689 "strip_size_kb": 0, 00:25:05.689 "state": "online", 00:25:05.689 "raid_level": "raid1", 00:25:05.689 "superblock": false, 00:25:05.689 "num_base_bdevs": 4, 00:25:05.689 "num_base_bdevs_discovered": 3, 00:25:05.689 "num_base_bdevs_operational": 3, 00:25:05.689 "process": { 00:25:05.689 "type": "rebuild", 00:25:05.689 "target": "spare", 00:25:05.689 "progress": { 00:25:05.689 "blocks": 43008, 00:25:05.689 "percent": 65 00:25:05.689 } 00:25:05.689 }, 00:25:05.689 "base_bdevs_list": [ 00:25:05.689 { 00:25:05.689 "name": "spare", 00:25:05.689 "uuid": "26000567-c727-564f-9fe1-8efb313fc5aa", 00:25:05.689 "is_configured": true, 00:25:05.689 "data_offset": 0, 00:25:05.690 "data_size": 65536 00:25:05.690 }, 00:25:05.690 { 00:25:05.690 "name": null, 00:25:05.690 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:05.690 "is_configured": false, 00:25:05.690 "data_offset": 0, 00:25:05.690 "data_size": 65536 00:25:05.690 }, 00:25:05.690 { 00:25:05.690 "name": "BaseBdev3", 00:25:05.690 "uuid": "3baa5622-0708-537f-bf0f-74517f630007", 00:25:05.690 "is_configured": true, 00:25:05.690 "data_offset": 0, 00:25:05.690 "data_size": 65536 00:25:05.690 }, 00:25:05.690 { 00:25:05.690 "name": "BaseBdev4", 00:25:05.690 "uuid": "e72aaead-94b1-5774-a977-15402323d285", 00:25:05.690 "is_configured": true, 00:25:05.690 "data_offset": 0, 00:25:05.690 "data_size": 65536 00:25:05.690 } 00:25:05.690 ] 00:25:05.690 }' 00:25:05.690 13:50:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:05.690 13:50:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:05.690 13:50:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:05.690 13:50:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:05.690 13:50:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:06.625 [2024-07-12 13:50:55.048222] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:06.625 [2024-07-12 13:50:55.048290] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:06.625 [2024-07-12 13:50:55.048328] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:06.625 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:06.625 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:06.625 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:06.625 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:06.625 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:06.625 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:06.625 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.625 13:50:55 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.884 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:06.884 "name": "raid_bdev1", 00:25:06.884 "uuid": "80823e56-a08e-4fbc-b370-f7808812cfb2", 00:25:06.884 "strip_size_kb": 0, 00:25:06.884 "state": "online", 00:25:06.884 "raid_level": "raid1", 00:25:06.884 "superblock": false, 00:25:06.884 "num_base_bdevs": 4, 00:25:06.884 "num_base_bdevs_discovered": 3, 00:25:06.884 "num_base_bdevs_operational": 3, 00:25:06.884 "base_bdevs_list": [ 00:25:06.884 { 00:25:06.884 "name": "spare", 00:25:06.884 "uuid": "26000567-c727-564f-9fe1-8efb313fc5aa", 00:25:06.884 "is_configured": true, 00:25:06.884 "data_offset": 0, 00:25:06.884 "data_size": 65536 00:25:06.884 }, 00:25:06.884 { 00:25:06.884 "name": null, 00:25:06.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:06.884 "is_configured": false, 00:25:06.884 "data_offset": 0, 00:25:06.884 "data_size": 65536 00:25:06.884 }, 00:25:06.884 { 00:25:06.884 "name": "BaseBdev3", 00:25:06.884 "uuid": "3baa5622-0708-537f-bf0f-74517f630007", 00:25:06.884 "is_configured": true, 00:25:06.884 "data_offset": 0, 00:25:06.884 "data_size": 65536 00:25:06.884 }, 00:25:06.884 { 00:25:06.884 "name": "BaseBdev4", 00:25:06.884 "uuid": "e72aaead-94b1-5774-a977-15402323d285", 00:25:06.884 "is_configured": true, 00:25:06.884 "data_offset": 0, 00:25:06.884 "data_size": 65536 00:25:06.884 } 00:25:06.884 ] 00:25:06.884 }' 00:25:06.884 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:06.884 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:06.884 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:07.142 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:07.142 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:25:07.142 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:07.142 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:07.142 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:07.142 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:07.142 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:07.142 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.142 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.400 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:07.400 "name": "raid_bdev1", 00:25:07.400 "uuid": "80823e56-a08e-4fbc-b370-f7808812cfb2", 00:25:07.400 "strip_size_kb": 0, 00:25:07.400 "state": "online", 00:25:07.400 "raid_level": "raid1", 00:25:07.400 "superblock": false, 00:25:07.400 "num_base_bdevs": 4, 00:25:07.400 "num_base_bdevs_discovered": 3, 00:25:07.400 "num_base_bdevs_operational": 3, 00:25:07.400 "base_bdevs_list": [ 00:25:07.400 { 00:25:07.400 "name": "spare", 00:25:07.400 "uuid": "26000567-c727-564f-9fe1-8efb313fc5aa", 00:25:07.400 "is_configured": true, 00:25:07.400 "data_offset": 0, 00:25:07.400 "data_size": 
65536 00:25:07.400 }, 00:25:07.400 { 00:25:07.400 "name": null, 00:25:07.400 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.400 "is_configured": false, 00:25:07.400 "data_offset": 0, 00:25:07.400 "data_size": 65536 00:25:07.400 }, 00:25:07.400 { 00:25:07.401 "name": "BaseBdev3", 00:25:07.401 "uuid": "3baa5622-0708-537f-bf0f-74517f630007", 00:25:07.401 "is_configured": true, 00:25:07.401 "data_offset": 0, 00:25:07.401 "data_size": 65536 00:25:07.401 }, 00:25:07.401 { 00:25:07.401 "name": "BaseBdev4", 00:25:07.401 "uuid": "e72aaead-94b1-5774-a977-15402323d285", 00:25:07.401 "is_configured": true, 00:25:07.401 "data_offset": 0, 00:25:07.401 "data_size": 65536 00:25:07.401 } 00:25:07.401 ] 00:25:07.401 }' 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.401 13:50:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.659 13:50:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.659 "name": "raid_bdev1", 00:25:07.659 "uuid": "80823e56-a08e-4fbc-b370-f7808812cfb2", 00:25:07.659 "strip_size_kb": 0, 00:25:07.659 "state": "online", 00:25:07.659 "raid_level": "raid1", 00:25:07.659 "superblock": false, 00:25:07.659 "num_base_bdevs": 4, 00:25:07.659 "num_base_bdevs_discovered": 3, 00:25:07.659 "num_base_bdevs_operational": 3, 00:25:07.659 "base_bdevs_list": [ 00:25:07.659 { 00:25:07.659 "name": "spare", 00:25:07.659 "uuid": "26000567-c727-564f-9fe1-8efb313fc5aa", 00:25:07.659 "is_configured": true, 00:25:07.659 "data_offset": 0, 00:25:07.659 "data_size": 65536 00:25:07.659 }, 00:25:07.659 { 00:25:07.659 "name": null, 00:25:07.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.659 "is_configured": false, 00:25:07.659 "data_offset": 0, 00:25:07.659 "data_size": 65536 00:25:07.659 }, 00:25:07.659 { 00:25:07.659 "name": "BaseBdev3", 00:25:07.659 "uuid": 
"3baa5622-0708-537f-bf0f-74517f630007", 00:25:07.659 "is_configured": true, 00:25:07.659 "data_offset": 0, 00:25:07.659 "data_size": 65536 00:25:07.659 }, 00:25:07.659 { 00:25:07.659 "name": "BaseBdev4", 00:25:07.659 "uuid": "e72aaead-94b1-5774-a977-15402323d285", 00:25:07.659 "is_configured": true, 00:25:07.659 "data_offset": 0, 00:25:07.659 "data_size": 65536 00:25:07.659 } 00:25:07.659 ] 00:25:07.659 }' 00:25:07.659 13:50:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.659 13:50:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:08.225 13:50:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:08.484 [2024-07-12 13:50:56.897746] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:08.484 [2024-07-12 13:50:56.897778] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:08.484 [2024-07-12 13:50:56.897843] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:08.484 [2024-07-12 13:50:56.897914] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:08.485 [2024-07-12 13:50:56.897932] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd441e0 name raid_bdev1, state offline 00:25:08.485 13:50:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.485 13:50:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:08.743 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:09.001 /dev/nbd0 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 
00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:09.001 1+0 records in 00:25:09.001 1+0 records out 00:25:09.001 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228768 s, 17.9 MB/s 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:09.001 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:09.261 /dev/nbd1 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:09.261 1+0 records in 00:25:09.261 1+0 records out 00:25:09.261 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313996 s, 13.0 MB/s 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:09.261 13:50:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:09.519 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:09.778 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:09.778 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:09.778 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:09.778 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:09.778 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:09.778 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:09.778 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:09.778 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:09.778 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:10.038 13:50:58 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 553759 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 553759 ']' 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 553759 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 553759 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 553759' 00:25:10.038 killing process with pid 553759 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 553759 00:25:10.038 Received shutdown signal, test time was about 60.000000 seconds 00:25:10.038 00:25:10.038 Latency(us) 00:25:10.038 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:10.038 =================================================================================================================== 00:25:10.038 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:10.038 [2024-07-12 13:50:58.436432] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:10.038 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 553759 00:25:10.038 [2024-07-12 13:50:58.484800] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:25:10.298 00:25:10.298 real 0m24.841s 00:25:10.298 user 0m32.449s 00:25:10.298 sys 0m5.969s 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:10.298 ************************************ 00:25:10.298 END TEST raid_rebuild_test 00:25:10.298 ************************************ 00:25:10.298 13:50:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:10.298 13:50:58 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:25:10.298 13:50:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:10.298 13:50:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:10.298 13:50:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:10.298 ************************************ 00:25:10.298 START TEST raid_rebuild_test_sb 00:25:10.298 ************************************ 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:10.298 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=557149 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 557149 /var/tmp/spdk-raid.sock 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 557149 ']' 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local 
max_retries=100 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:10.299 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:10.299 13:50:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:10.299 [2024-07-12 13:50:58.873703] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:25:10.299 [2024-07-12 13:50:58.873772] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid557149 ] 00:25:10.299 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:10.299 Zero copy mechanism will not be used. 00:25:10.558 [2024-07-12 13:50:59.004121] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:10.558 [2024-07-12 13:50:59.107979] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:10.818 [2024-07-12 13:50:59.166701] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:10.818 [2024-07-12 13:50:59.166734] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:11.386 13:50:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:11.386 13:50:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:25:11.386 13:50:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:11.386 13:50:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:11.645 BaseBdev1_malloc 00:25:11.645 13:51:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:11.904 [2024-07-12 13:51:00.231316] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:11.904 [2024-07-12 13:51:00.231375] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:11.904 [2024-07-12 13:51:00.231400] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1751680 00:25:11.904 [2024-07-12 13:51:00.231412] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:11.904 [2024-07-12 13:51:00.233109] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:11.904 [2024-07-12 13:51:00.233136] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:11.904 BaseBdev1 00:25:11.904 13:51:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:11.904 13:51:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:12.163 BaseBdev2_malloc 00:25:12.163 13:51:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:12.163 [2024-07-12 13:51:00.733497] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:12.163 [2024-07-12 13:51:00.733542] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:12.163 [2024-07-12 13:51:00.733564] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17521a0 00:25:12.163 [2024-07-12 13:51:00.733576] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:12.163 [2024-07-12 13:51:00.735050] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:12.163 [2024-07-12 13:51:00.735080] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:12.163 BaseBdev2 00:25:12.421 13:51:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:12.421 13:51:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:12.421 BaseBdev3_malloc 00:25:12.421 13:51:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:12.680 [2024-07-12 13:51:01.175320] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:12.680 [2024-07-12 13:51:01.175373] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:12.680 [2024-07-12 13:51:01.175394] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18ff230 00:25:12.680 [2024-07-12 13:51:01.175406] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:12.680 [2024-07-12 13:51:01.176985] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:12.680 [2024-07-12 13:51:01.177015] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:12.680 BaseBdev3 00:25:12.680 13:51:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:12.680 13:51:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:12.938 BaseBdev4_malloc 00:25:12.938 13:51:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:13.195 [2024-07-12 13:51:01.677296] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:13.195 [2024-07-12 13:51:01.677345] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:13.195 [2024-07-12 13:51:01.677364] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fe410 00:25:13.195 [2024-07-12 13:51:01.677377] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:13.195 [2024-07-12 13:51:01.678777] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:13.195 [2024-07-12 13:51:01.678810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:25:13.195 BaseBdev4 00:25:13.195 13:51:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:13.453 spare_malloc 00:25:13.453 13:51:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:13.712 spare_delay 00:25:13.712 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:13.970 [2024-07-12 13:51:02.431912] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:13.970 [2024-07-12 13:51:02.431964] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:13.970 [2024-07-12 13:51:02.431983] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1902ef0 00:25:13.970 [2024-07-12 13:51:02.431995] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:13.970 [2024-07-12 13:51:02.433457] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:13.970 [2024-07-12 13:51:02.433485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:13.970 spare 00:25:13.970 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:14.228 [2024-07-12 13:51:02.680599] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:14.228 [2024-07-12 13:51:02.681760] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:14.228 [2024-07-12 13:51:02.681813] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:14.228 [2024-07-12 13:51:02.681858] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:14.228 [2024-07-12 13:51:02.682054] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18821e0 00:25:14.228 [2024-07-12 13:51:02.682066] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:14.228 [2024-07-12 13:51:02.682255] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fc750 00:25:14.228 [2024-07-12 13:51:02.682399] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18821e0 00:25:14.228 [2024-07-12 13:51:02.682410] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18821e0 00:25:14.228 [2024-07-12 13:51:02.682499] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:14.228 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:14.228 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:14.228 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:14.228 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:14.229 13:51:02 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:14.229 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:14.229 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:14.229 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:14.229 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:14.229 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:14.229 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.229 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.487 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:14.487 "name": "raid_bdev1", 00:25:14.487 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:14.487 "strip_size_kb": 0, 00:25:14.487 "state": "online", 00:25:14.487 "raid_level": "raid1", 00:25:14.487 "superblock": true, 00:25:14.487 "num_base_bdevs": 4, 00:25:14.487 "num_base_bdevs_discovered": 4, 00:25:14.487 "num_base_bdevs_operational": 4, 00:25:14.487 "base_bdevs_list": [ 00:25:14.487 { 00:25:14.487 "name": "BaseBdev1", 00:25:14.487 "uuid": "69ec5716-bf90-524b-a157-8b7a572c73c2", 00:25:14.487 "is_configured": true, 00:25:14.487 "data_offset": 2048, 00:25:14.487 "data_size": 63488 00:25:14.487 }, 00:25:14.487 { 00:25:14.487 "name": "BaseBdev2", 00:25:14.487 "uuid": "cb907cf1-12d7-55bb-b8a7-ecc857528842", 00:25:14.487 "is_configured": true, 00:25:14.487 "data_offset": 2048, 00:25:14.487 "data_size": 63488 00:25:14.487 }, 00:25:14.487 { 00:25:14.487 "name": "BaseBdev3", 00:25:14.487 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:14.487 "is_configured": true, 00:25:14.487 "data_offset": 2048, 00:25:14.487 "data_size": 63488 00:25:14.487 }, 00:25:14.487 { 00:25:14.487 "name": "BaseBdev4", 00:25:14.487 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:14.487 "is_configured": true, 00:25:14.487 "data_offset": 2048, 00:25:14.487 "data_size": 63488 00:25:14.487 } 00:25:14.487 ] 00:25:14.487 }' 00:25:14.488 13:51:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:14.488 13:51:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:15.052 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:15.052 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:15.310 [2024-07-12 13:51:03.784046] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:15.310 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:15.310 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.310 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:25:15.568 13:51:03 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:15.568 13:51:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:15.827 [2024-07-12 13:51:04.220942] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fc750 00:25:15.827 /dev/nbd0 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.827 1+0 records in 00:25:15.827 1+0 records out 00:25:15.827 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258118 s, 15.9 MB/s 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:15.827 13:51:04 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:15.827 13:51:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:25:23.943 63488+0 records in 00:25:23.943 63488+0 records out 00:25:23.944 32505856 bytes (33 MB, 31 MiB) copied, 7.19256 s, 4.5 MB/s 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:23.944 [2024-07-12 13:51:11.681882] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:23.944 [2024-07-12 13:51:11.909183] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:23.944 13:51:11 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.944 13:51:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.944 13:51:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:23.944 "name": "raid_bdev1", 00:25:23.944 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:23.944 "strip_size_kb": 0, 00:25:23.944 "state": "online", 00:25:23.944 "raid_level": "raid1", 00:25:23.944 "superblock": true, 00:25:23.944 "num_base_bdevs": 4, 00:25:23.944 "num_base_bdevs_discovered": 3, 00:25:23.944 "num_base_bdevs_operational": 3, 00:25:23.944 "base_bdevs_list": [ 00:25:23.944 { 00:25:23.944 "name": null, 00:25:23.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.944 "is_configured": false, 00:25:23.944 "data_offset": 2048, 00:25:23.944 "data_size": 63488 00:25:23.944 }, 00:25:23.944 { 00:25:23.944 "name": "BaseBdev2", 00:25:23.944 "uuid": "cb907cf1-12d7-55bb-b8a7-ecc857528842", 00:25:23.944 "is_configured": true, 00:25:23.944 "data_offset": 2048, 00:25:23.944 "data_size": 63488 00:25:23.944 }, 00:25:23.944 { 00:25:23.944 "name": "BaseBdev3", 00:25:23.944 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:23.944 "is_configured": true, 00:25:23.944 "data_offset": 2048, 00:25:23.944 "data_size": 63488 00:25:23.944 }, 00:25:23.944 { 00:25:23.944 "name": "BaseBdev4", 00:25:23.944 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:23.944 "is_configured": true, 00:25:23.944 "data_offset": 2048, 00:25:23.944 "data_size": 63488 00:25:23.944 } 00:25:23.944 ] 00:25:23.944 }' 00:25:23.944 13:51:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:23.944 13:51:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:24.512 13:51:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:24.513 [2024-07-12 13:51:13.016140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:24.513 [2024-07-12 13:51:13.020524] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fc750 00:25:24.513 [2024-07-12 13:51:13.022908] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:24.513 13:51:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:25.890 "name": "raid_bdev1", 00:25:25.890 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:25.890 "strip_size_kb": 0, 00:25:25.890 "state": "online", 00:25:25.890 "raid_level": "raid1", 00:25:25.890 "superblock": true, 00:25:25.890 "num_base_bdevs": 4, 00:25:25.890 "num_base_bdevs_discovered": 4, 00:25:25.890 "num_base_bdevs_operational": 4, 00:25:25.890 "process": { 00:25:25.890 "type": "rebuild", 00:25:25.890 "target": "spare", 00:25:25.890 "progress": { 00:25:25.890 "blocks": 24576, 00:25:25.890 "percent": 38 00:25:25.890 } 00:25:25.890 }, 00:25:25.890 "base_bdevs_list": [ 00:25:25.890 { 00:25:25.890 "name": "spare", 00:25:25.890 "uuid": "604b177c-8ac7-5a3a-b985-92b366fbc699", 00:25:25.890 "is_configured": true, 00:25:25.890 "data_offset": 2048, 00:25:25.890 "data_size": 63488 00:25:25.890 }, 00:25:25.890 { 00:25:25.890 "name": "BaseBdev2", 00:25:25.890 "uuid": "cb907cf1-12d7-55bb-b8a7-ecc857528842", 00:25:25.890 "is_configured": true, 00:25:25.890 "data_offset": 2048, 00:25:25.890 "data_size": 63488 00:25:25.890 }, 00:25:25.890 { 00:25:25.890 "name": "BaseBdev3", 00:25:25.890 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:25.890 "is_configured": true, 00:25:25.890 "data_offset": 2048, 00:25:25.890 "data_size": 63488 00:25:25.890 }, 00:25:25.890 { 00:25:25.890 "name": "BaseBdev4", 00:25:25.890 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:25.890 "is_configured": true, 00:25:25.890 "data_offset": 2048, 00:25:25.890 "data_size": 63488 00:25:25.890 } 00:25:25.890 ] 00:25:25.890 }' 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:25.890 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:26.149 [2024-07-12 13:51:14.605259] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:26.149 [2024-07-12 13:51:14.635603] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:26.149 [2024-07-12 13:51:14.635649] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:26.149 [2024-07-12 13:51:14.635667] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:26.149 [2024-07-12 13:51:14.635675] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:26.149 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:26.149 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
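For orientation in the superblock (sb) run above: the array was first seeded with random data through its nbd export, one base bdev was then removed, and the spare was hot-added so that a rebuild starts onto it; the jq checks around this point are the same rebuild poll used in the non-sb test earlier in this log. Those preparation steps, condensed from commands that appear verbatim in the trace (paths shortened to the hypothetical $SPDK_DIR as above, error handling omitted):

  # write 63488 x 512-byte random blocks through the nbd export of raid_bdev1
  dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct
  # fail one leg, then attach the spare; the raid module starts rebuilding onto it
  $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
  $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare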
00:25:26.149 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:26.149 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:26.149 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:26.149 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:26.149 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:26.149 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:26.149 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:26.149 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:26.149 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.150 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.409 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.409 "name": "raid_bdev1", 00:25:26.409 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:26.409 "strip_size_kb": 0, 00:25:26.409 "state": "online", 00:25:26.409 "raid_level": "raid1", 00:25:26.409 "superblock": true, 00:25:26.409 "num_base_bdevs": 4, 00:25:26.409 "num_base_bdevs_discovered": 3, 00:25:26.409 "num_base_bdevs_operational": 3, 00:25:26.409 "base_bdevs_list": [ 00:25:26.409 { 00:25:26.409 "name": null, 00:25:26.409 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.409 "is_configured": false, 00:25:26.409 "data_offset": 2048, 00:25:26.409 "data_size": 63488 00:25:26.409 }, 00:25:26.409 { 00:25:26.409 "name": "BaseBdev2", 00:25:26.409 "uuid": "cb907cf1-12d7-55bb-b8a7-ecc857528842", 00:25:26.409 "is_configured": true, 00:25:26.409 "data_offset": 2048, 00:25:26.409 "data_size": 63488 00:25:26.409 }, 00:25:26.409 { 00:25:26.409 "name": "BaseBdev3", 00:25:26.409 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:26.409 "is_configured": true, 00:25:26.409 "data_offset": 2048, 00:25:26.409 "data_size": 63488 00:25:26.409 }, 00:25:26.409 { 00:25:26.409 "name": "BaseBdev4", 00:25:26.409 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:26.409 "is_configured": true, 00:25:26.409 "data_offset": 2048, 00:25:26.409 "data_size": 63488 00:25:26.409 } 00:25:26.409 ] 00:25:26.409 }' 00:25:26.409 13:51:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:26.409 13:51:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:26.978 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:26.978 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:26.978 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:26.978 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:26.978 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:26.978 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:25:26.978 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.237 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:27.237 "name": "raid_bdev1", 00:25:27.237 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:27.237 "strip_size_kb": 0, 00:25:27.237 "state": "online", 00:25:27.237 "raid_level": "raid1", 00:25:27.237 "superblock": true, 00:25:27.237 "num_base_bdevs": 4, 00:25:27.237 "num_base_bdevs_discovered": 3, 00:25:27.237 "num_base_bdevs_operational": 3, 00:25:27.237 "base_bdevs_list": [ 00:25:27.237 { 00:25:27.237 "name": null, 00:25:27.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.237 "is_configured": false, 00:25:27.237 "data_offset": 2048, 00:25:27.237 "data_size": 63488 00:25:27.237 }, 00:25:27.237 { 00:25:27.237 "name": "BaseBdev2", 00:25:27.237 "uuid": "cb907cf1-12d7-55bb-b8a7-ecc857528842", 00:25:27.237 "is_configured": true, 00:25:27.237 "data_offset": 2048, 00:25:27.237 "data_size": 63488 00:25:27.237 }, 00:25:27.237 { 00:25:27.237 "name": "BaseBdev3", 00:25:27.237 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:27.237 "is_configured": true, 00:25:27.237 "data_offset": 2048, 00:25:27.237 "data_size": 63488 00:25:27.237 }, 00:25:27.237 { 00:25:27.237 "name": "BaseBdev4", 00:25:27.237 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:27.237 "is_configured": true, 00:25:27.237 "data_offset": 2048, 00:25:27.237 "data_size": 63488 00:25:27.237 } 00:25:27.237 ] 00:25:27.237 }' 00:25:27.237 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:27.237 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:27.237 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:27.495 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:27.495 13:51:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:27.754 [2024-07-12 13:51:16.083473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:27.754 [2024-07-12 13:51:16.088020] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18827d0 00:25:27.754 [2024-07-12 13:51:16.089560] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:27.754 13:51:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:28.690 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:28.690 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:28.690 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:28.690 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:28.690 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:28.690 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.690 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name 
== "raid_bdev1")' 00:25:28.950 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:28.950 "name": "raid_bdev1", 00:25:28.950 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:28.950 "strip_size_kb": 0, 00:25:28.950 "state": "online", 00:25:28.950 "raid_level": "raid1", 00:25:28.950 "superblock": true, 00:25:28.950 "num_base_bdevs": 4, 00:25:28.950 "num_base_bdevs_discovered": 4, 00:25:28.950 "num_base_bdevs_operational": 4, 00:25:28.950 "process": { 00:25:28.950 "type": "rebuild", 00:25:28.950 "target": "spare", 00:25:28.950 "progress": { 00:25:28.950 "blocks": 24576, 00:25:28.950 "percent": 38 00:25:28.950 } 00:25:28.950 }, 00:25:28.950 "base_bdevs_list": [ 00:25:28.950 { 00:25:28.950 "name": "spare", 00:25:28.950 "uuid": "604b177c-8ac7-5a3a-b985-92b366fbc699", 00:25:28.950 "is_configured": true, 00:25:28.950 "data_offset": 2048, 00:25:28.950 "data_size": 63488 00:25:28.950 }, 00:25:28.950 { 00:25:28.950 "name": "BaseBdev2", 00:25:28.950 "uuid": "cb907cf1-12d7-55bb-b8a7-ecc857528842", 00:25:28.950 "is_configured": true, 00:25:28.950 "data_offset": 2048, 00:25:28.950 "data_size": 63488 00:25:28.950 }, 00:25:28.950 { 00:25:28.950 "name": "BaseBdev3", 00:25:28.950 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:28.950 "is_configured": true, 00:25:28.950 "data_offset": 2048, 00:25:28.950 "data_size": 63488 00:25:28.950 }, 00:25:28.950 { 00:25:28.950 "name": "BaseBdev4", 00:25:28.950 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:28.950 "is_configured": true, 00:25:28.950 "data_offset": 2048, 00:25:28.950 "data_size": 63488 00:25:28.950 } 00:25:28.950 ] 00:25:28.950 }' 00:25:28.950 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:28.950 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:28.950 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:28.950 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:28.950 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:28.950 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:28.950 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:28.950 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:28.950 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:28.950 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:28.950 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:29.209 [2024-07-12 13:51:17.661365] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:29.468 [2024-07-12 13:51:17.802225] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x18827d0 00:25:29.468 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:29.468 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:29.468 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:25:29.468 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.468 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:29.468 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:29.468 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.468 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.468 13:51:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.728 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.728 "name": "raid_bdev1", 00:25:29.728 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:29.728 "strip_size_kb": 0, 00:25:29.728 "state": "online", 00:25:29.728 "raid_level": "raid1", 00:25:29.728 "superblock": true, 00:25:29.728 "num_base_bdevs": 4, 00:25:29.728 "num_base_bdevs_discovered": 3, 00:25:29.728 "num_base_bdevs_operational": 3, 00:25:29.728 "process": { 00:25:29.729 "type": "rebuild", 00:25:29.729 "target": "spare", 00:25:29.729 "progress": { 00:25:29.729 "blocks": 36864, 00:25:29.729 "percent": 58 00:25:29.729 } 00:25:29.729 }, 00:25:29.729 "base_bdevs_list": [ 00:25:29.729 { 00:25:29.729 "name": "spare", 00:25:29.729 "uuid": "604b177c-8ac7-5a3a-b985-92b366fbc699", 00:25:29.729 "is_configured": true, 00:25:29.729 "data_offset": 2048, 00:25:29.729 "data_size": 63488 00:25:29.729 }, 00:25:29.729 { 00:25:29.729 "name": null, 00:25:29.729 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.729 "is_configured": false, 00:25:29.729 "data_offset": 2048, 00:25:29.729 "data_size": 63488 00:25:29.729 }, 00:25:29.729 { 00:25:29.729 "name": "BaseBdev3", 00:25:29.729 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:29.729 "is_configured": true, 00:25:29.729 "data_offset": 2048, 00:25:29.729 "data_size": 63488 00:25:29.729 }, 00:25:29.729 { 00:25:29.729 "name": "BaseBdev4", 00:25:29.729 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:29.729 "is_configured": true, 00:25:29.729 "data_offset": 2048, 00:25:29.729 "data_size": 63488 00:25:29.729 } 00:25:29.729 ] 00:25:29.729 }' 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=942 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:29.729 13:51:18 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.729 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.988 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.988 "name": "raid_bdev1", 00:25:29.988 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:29.988 "strip_size_kb": 0, 00:25:29.988 "state": "online", 00:25:29.988 "raid_level": "raid1", 00:25:29.988 "superblock": true, 00:25:29.988 "num_base_bdevs": 4, 00:25:29.988 "num_base_bdevs_discovered": 3, 00:25:29.988 "num_base_bdevs_operational": 3, 00:25:29.988 "process": { 00:25:29.988 "type": "rebuild", 00:25:29.988 "target": "spare", 00:25:29.988 "progress": { 00:25:29.988 "blocks": 43008, 00:25:29.988 "percent": 67 00:25:29.988 } 00:25:29.988 }, 00:25:29.988 "base_bdevs_list": [ 00:25:29.988 { 00:25:29.988 "name": "spare", 00:25:29.988 "uuid": "604b177c-8ac7-5a3a-b985-92b366fbc699", 00:25:29.988 "is_configured": true, 00:25:29.988 "data_offset": 2048, 00:25:29.988 "data_size": 63488 00:25:29.988 }, 00:25:29.988 { 00:25:29.988 "name": null, 00:25:29.988 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.988 "is_configured": false, 00:25:29.988 "data_offset": 2048, 00:25:29.988 "data_size": 63488 00:25:29.988 }, 00:25:29.988 { 00:25:29.988 "name": "BaseBdev3", 00:25:29.988 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:29.988 "is_configured": true, 00:25:29.988 "data_offset": 2048, 00:25:29.988 "data_size": 63488 00:25:29.988 }, 00:25:29.988 { 00:25:29.988 "name": "BaseBdev4", 00:25:29.988 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:29.988 "is_configured": true, 00:25:29.988 "data_offset": 2048, 00:25:29.988 "data_size": 63488 00:25:29.988 } 00:25:29.988 ] 00:25:29.988 }' 00:25:29.988 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.988 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:29.988 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.988 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:29.988 13:51:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:30.932 [2024-07-12 13:51:19.313862] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:30.932 [2024-07-12 13:51:19.313922] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:30.932 [2024-07-12 13:51:19.314023] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:30.932 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:30.932 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:30.932 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:30.932 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:30.932 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:30.932 13:51:19 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.190 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.190 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.190 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:31.190 "name": "raid_bdev1", 00:25:31.190 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:31.190 "strip_size_kb": 0, 00:25:31.190 "state": "online", 00:25:31.190 "raid_level": "raid1", 00:25:31.190 "superblock": true, 00:25:31.190 "num_base_bdevs": 4, 00:25:31.190 "num_base_bdevs_discovered": 3, 00:25:31.190 "num_base_bdevs_operational": 3, 00:25:31.190 "base_bdevs_list": [ 00:25:31.190 { 00:25:31.190 "name": "spare", 00:25:31.190 "uuid": "604b177c-8ac7-5a3a-b985-92b366fbc699", 00:25:31.190 "is_configured": true, 00:25:31.190 "data_offset": 2048, 00:25:31.190 "data_size": 63488 00:25:31.190 }, 00:25:31.190 { 00:25:31.190 "name": null, 00:25:31.190 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:31.190 "is_configured": false, 00:25:31.190 "data_offset": 2048, 00:25:31.190 "data_size": 63488 00:25:31.190 }, 00:25:31.190 { 00:25:31.190 "name": "BaseBdev3", 00:25:31.190 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:31.190 "is_configured": true, 00:25:31.190 "data_offset": 2048, 00:25:31.190 "data_size": 63488 00:25:31.190 }, 00:25:31.190 { 00:25:31.190 "name": "BaseBdev4", 00:25:31.190 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:31.190 "is_configured": true, 00:25:31.190 "data_offset": 2048, 00:25:31.190 "data_size": 63488 00:25:31.190 } 00:25:31.190 ] 00:25:31.190 }' 00:25:31.190 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:31.448 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:31.448 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:31.448 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:31.448 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:25:31.448 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:31.448 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:31.448 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:31.448 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:31.448 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.448 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.448 13:51:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.706 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:31.706 "name": "raid_bdev1", 00:25:31.706 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:31.706 "strip_size_kb": 0, 00:25:31.706 "state": "online", 00:25:31.706 "raid_level": "raid1", 
00:25:31.706 "superblock": true, 00:25:31.706 "num_base_bdevs": 4, 00:25:31.706 "num_base_bdevs_discovered": 3, 00:25:31.706 "num_base_bdevs_operational": 3, 00:25:31.706 "base_bdevs_list": [ 00:25:31.706 { 00:25:31.706 "name": "spare", 00:25:31.706 "uuid": "604b177c-8ac7-5a3a-b985-92b366fbc699", 00:25:31.706 "is_configured": true, 00:25:31.706 "data_offset": 2048, 00:25:31.706 "data_size": 63488 00:25:31.706 }, 00:25:31.706 { 00:25:31.706 "name": null, 00:25:31.706 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:31.706 "is_configured": false, 00:25:31.706 "data_offset": 2048, 00:25:31.706 "data_size": 63488 00:25:31.706 }, 00:25:31.706 { 00:25:31.706 "name": "BaseBdev3", 00:25:31.706 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:31.706 "is_configured": true, 00:25:31.706 "data_offset": 2048, 00:25:31.706 "data_size": 63488 00:25:31.706 }, 00:25:31.706 { 00:25:31.706 "name": "BaseBdev4", 00:25:31.706 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:31.706 "is_configured": true, 00:25:31.706 "data_offset": 2048, 00:25:31.706 "data_size": 63488 00:25:31.706 } 00:25:31.706 ] 00:25:31.706 }' 00:25:31.706 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:31.706 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:31.706 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:31.706 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:31.706 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:31.706 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:31.707 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:31.707 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:31.707 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:31.707 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:31.707 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:31.707 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:31.707 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:31.707 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:31.707 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.707 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.965 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:31.965 "name": "raid_bdev1", 00:25:31.965 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:31.965 "strip_size_kb": 0, 00:25:31.965 "state": "online", 00:25:31.965 "raid_level": "raid1", 00:25:31.965 "superblock": true, 00:25:31.965 "num_base_bdevs": 4, 00:25:31.965 "num_base_bdevs_discovered": 3, 00:25:31.965 "num_base_bdevs_operational": 3, 00:25:31.965 "base_bdevs_list": [ 00:25:31.965 { 00:25:31.965 "name": "spare", 00:25:31.965 "uuid": 
"604b177c-8ac7-5a3a-b985-92b366fbc699", 00:25:31.965 "is_configured": true, 00:25:31.965 "data_offset": 2048, 00:25:31.965 "data_size": 63488 00:25:31.965 }, 00:25:31.965 { 00:25:31.965 "name": null, 00:25:31.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:31.965 "is_configured": false, 00:25:31.965 "data_offset": 2048, 00:25:31.965 "data_size": 63488 00:25:31.965 }, 00:25:31.965 { 00:25:31.965 "name": "BaseBdev3", 00:25:31.965 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:31.965 "is_configured": true, 00:25:31.965 "data_offset": 2048, 00:25:31.965 "data_size": 63488 00:25:31.965 }, 00:25:31.965 { 00:25:31.965 "name": "BaseBdev4", 00:25:31.965 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:31.965 "is_configured": true, 00:25:31.965 "data_offset": 2048, 00:25:31.965 "data_size": 63488 00:25:31.965 } 00:25:31.965 ] 00:25:31.965 }' 00:25:31.965 13:51:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:31.965 13:51:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:32.532 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:32.791 [2024-07-12 13:51:21.283213] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:32.791 [2024-07-12 13:51:21.283243] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:32.791 [2024-07-12 13:51:21.283304] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:32.791 [2024-07-12 13:51:21.283375] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:32.791 [2024-07-12 13:51:21.283386] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18821e0 name raid_bdev1, state offline 00:25:32.791 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.791 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:33.051 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:33.051 13:51:21 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:33.312 /dev/nbd0 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:33.312 1+0 records in 00:25:33.312 1+0 records out 00:25:33.312 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002551 s, 16.1 MB/s 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:33.312 13:51:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:33.627 /dev/nbd1 00:25:33.627 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:33.627 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:33.627 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:33.627 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:33.627 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:33.627 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:33.627 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:33.627 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:33.628 13:51:22 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:33.628 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:33.628 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:33.628 1+0 records in 00:25:33.628 1+0 records out 00:25:33.628 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308497 s, 13.3 MB/s 00:25:33.628 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:33.628 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:33.628 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:33.628 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:33.628 13:51:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:33.628 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:33.628 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:33.628 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:33.950 13:51:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:34.620 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:34.620 13:51:23 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:34.620 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:34.620 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:34.620 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:34.620 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:34.620 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:34.620 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:34.620 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:34.620 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:34.882 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:35.141 [2024-07-12 13:51:23.494732] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:35.141 [2024-07-12 13:51:23.494785] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:35.141 [2024-07-12 13:51:23.494807] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18fc480 00:25:35.141 [2024-07-12 13:51:23.494820] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:35.141 [2024-07-12 13:51:23.496538] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:35.141 [2024-07-12 13:51:23.496569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:35.141 [2024-07-12 13:51:23.496660] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:35.141 [2024-07-12 13:51:23.496687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:35.141 [2024-07-12 13:51:23.496798] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:35.141 [2024-07-12 13:51:23.496874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:35.141 spare 00:25:35.141 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:35.141 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:35.141 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:35.141 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:35.141 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:35.141 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:35.141 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:35.141 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:35.141 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:35.141 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:35.141 
13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.141 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.141 [2024-07-12 13:51:23.597204] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18864e0 00:25:35.141 [2024-07-12 13:51:23.597227] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:35.141 [2024-07-12 13:51:23.597457] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fbc30 00:25:35.141 [2024-07-12 13:51:23.597631] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18864e0 00:25:35.141 [2024-07-12 13:51:23.597641] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18864e0 00:25:35.141 [2024-07-12 13:51:23.597754] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:35.400 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:35.400 "name": "raid_bdev1", 00:25:35.400 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:35.400 "strip_size_kb": 0, 00:25:35.400 "state": "online", 00:25:35.400 "raid_level": "raid1", 00:25:35.400 "superblock": true, 00:25:35.400 "num_base_bdevs": 4, 00:25:35.400 "num_base_bdevs_discovered": 3, 00:25:35.400 "num_base_bdevs_operational": 3, 00:25:35.400 "base_bdevs_list": [ 00:25:35.400 { 00:25:35.400 "name": "spare", 00:25:35.400 "uuid": "604b177c-8ac7-5a3a-b985-92b366fbc699", 00:25:35.400 "is_configured": true, 00:25:35.400 "data_offset": 2048, 00:25:35.400 "data_size": 63488 00:25:35.400 }, 00:25:35.400 { 00:25:35.400 "name": null, 00:25:35.400 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:35.400 "is_configured": false, 00:25:35.400 "data_offset": 2048, 00:25:35.400 "data_size": 63488 00:25:35.400 }, 00:25:35.400 { 00:25:35.400 "name": "BaseBdev3", 00:25:35.400 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:35.400 "is_configured": true, 00:25:35.400 "data_offset": 2048, 00:25:35.400 "data_size": 63488 00:25:35.400 }, 00:25:35.400 { 00:25:35.400 "name": "BaseBdev4", 00:25:35.400 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:35.400 "is_configured": true, 00:25:35.400 "data_offset": 2048, 00:25:35.400 "data_size": 63488 00:25:35.400 } 00:25:35.400 ] 00:25:35.400 }' 00:25:35.400 13:51:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:35.400 13:51:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:35.968 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:35.968 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.968 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:35.968 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:35.968 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.968 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.968 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:25:35.968 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.968 "name": "raid_bdev1", 00:25:35.968 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:35.968 "strip_size_kb": 0, 00:25:35.968 "state": "online", 00:25:35.968 "raid_level": "raid1", 00:25:35.968 "superblock": true, 00:25:35.968 "num_base_bdevs": 4, 00:25:35.968 "num_base_bdevs_discovered": 3, 00:25:35.968 "num_base_bdevs_operational": 3, 00:25:35.968 "base_bdevs_list": [ 00:25:35.968 { 00:25:35.968 "name": "spare", 00:25:35.968 "uuid": "604b177c-8ac7-5a3a-b985-92b366fbc699", 00:25:35.968 "is_configured": true, 00:25:35.968 "data_offset": 2048, 00:25:35.968 "data_size": 63488 00:25:35.968 }, 00:25:35.968 { 00:25:35.968 "name": null, 00:25:35.968 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:35.968 "is_configured": false, 00:25:35.968 "data_offset": 2048, 00:25:35.968 "data_size": 63488 00:25:35.968 }, 00:25:35.968 { 00:25:35.968 "name": "BaseBdev3", 00:25:35.968 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:35.969 "is_configured": true, 00:25:35.969 "data_offset": 2048, 00:25:35.969 "data_size": 63488 00:25:35.969 }, 00:25:35.969 { 00:25:35.969 "name": "BaseBdev4", 00:25:35.969 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:35.969 "is_configured": true, 00:25:35.969 "data_offset": 2048, 00:25:35.969 "data_size": 63488 00:25:35.969 } 00:25:35.969 ] 00:25:35.969 }' 00:25:35.969 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:36.227 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:36.227 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:36.227 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:36.227 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.227 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:36.486 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:36.486 13:51:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:36.745 [2024-07-12 13:51:25.107101] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:36.745 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:36.745 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:36.745 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:36.745 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:36.745 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:36.745 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:36.745 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:36.745 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:36.745 13:51:25 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:36.745 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:36.745 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.745 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:37.065 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:37.065 "name": "raid_bdev1", 00:25:37.065 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:37.065 "strip_size_kb": 0, 00:25:37.065 "state": "online", 00:25:37.065 "raid_level": "raid1", 00:25:37.065 "superblock": true, 00:25:37.065 "num_base_bdevs": 4, 00:25:37.065 "num_base_bdevs_discovered": 2, 00:25:37.065 "num_base_bdevs_operational": 2, 00:25:37.065 "base_bdevs_list": [ 00:25:37.065 { 00:25:37.065 "name": null, 00:25:37.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:37.065 "is_configured": false, 00:25:37.065 "data_offset": 2048, 00:25:37.065 "data_size": 63488 00:25:37.065 }, 00:25:37.065 { 00:25:37.065 "name": null, 00:25:37.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:37.065 "is_configured": false, 00:25:37.065 "data_offset": 2048, 00:25:37.065 "data_size": 63488 00:25:37.065 }, 00:25:37.065 { 00:25:37.065 "name": "BaseBdev3", 00:25:37.065 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:37.065 "is_configured": true, 00:25:37.065 "data_offset": 2048, 00:25:37.065 "data_size": 63488 00:25:37.065 }, 00:25:37.065 { 00:25:37.065 "name": "BaseBdev4", 00:25:37.065 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:37.065 "is_configured": true, 00:25:37.065 "data_offset": 2048, 00:25:37.065 "data_size": 63488 00:25:37.065 } 00:25:37.065 ] 00:25:37.065 }' 00:25:37.065 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:37.065 13:51:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:37.634 13:51:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:37.634 [2024-07-12 13:51:26.202386] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:37.634 [2024-07-12 13:51:26.202549] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:37.634 [2024-07-12 13:51:26.202565] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
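The remove/re-add cycle traced above explains the examine messages around this point: bdev_raid_remove_base_bdev drops the spare, and when bdev_raid_add_base_bdev brings it back the examine path finds a superblock whose seq_number (5) is older than the array's (6), so the bdev is re-added and, as the next notice shows, a rebuild is started onto it rather than its stale data being trusted. A condensed, illustrative version of that RPC sequence, reusing the rpc.py path and socket seen throughout this log; the polling loop is a simplification, not an excerpt of bdev_raid.sh:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # Drop the spare; raid_bdev1 stays online with one fewer operational member.
  $rpc -s $sock bdev_raid_remove_base_bdev spare

  # Bring it back; examine sees the older superblock generation on the bdev
  # and schedules a rebuild onto it instead of a plain attach.
  $rpc -s $sock bdev_raid_add_base_bdev raid_bdev1 spare

  # Wait until bdev_raid_get_bdevs no longer reports a rebuild in progress.
  while [ "$($rpc -s $sock bdev_raid_get_bdevs all |
             jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"')" = rebuild ]; do
      sleep 1
  done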
00:25:37.634 [2024-07-12 13:51:26.202595] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:37.634 [2024-07-12 13:51:26.206545] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1902010 00:25:37.634 [2024-07-12 13:51:26.208917] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:37.893 13:51:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:38.830 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:38.830 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:38.830 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:38.830 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:38.830 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:38.830 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.830 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.089 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:39.089 "name": "raid_bdev1", 00:25:39.090 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:39.090 "strip_size_kb": 0, 00:25:39.090 "state": "online", 00:25:39.090 "raid_level": "raid1", 00:25:39.090 "superblock": true, 00:25:39.090 "num_base_bdevs": 4, 00:25:39.090 "num_base_bdevs_discovered": 3, 00:25:39.090 "num_base_bdevs_operational": 3, 00:25:39.090 "process": { 00:25:39.090 "type": "rebuild", 00:25:39.090 "target": "spare", 00:25:39.090 "progress": { 00:25:39.090 "blocks": 24576, 00:25:39.090 "percent": 38 00:25:39.090 } 00:25:39.090 }, 00:25:39.090 "base_bdevs_list": [ 00:25:39.090 { 00:25:39.090 "name": "spare", 00:25:39.090 "uuid": "604b177c-8ac7-5a3a-b985-92b366fbc699", 00:25:39.090 "is_configured": true, 00:25:39.090 "data_offset": 2048, 00:25:39.090 "data_size": 63488 00:25:39.090 }, 00:25:39.090 { 00:25:39.090 "name": null, 00:25:39.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.090 "is_configured": false, 00:25:39.090 "data_offset": 2048, 00:25:39.090 "data_size": 63488 00:25:39.090 }, 00:25:39.090 { 00:25:39.090 "name": "BaseBdev3", 00:25:39.090 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:39.090 "is_configured": true, 00:25:39.090 "data_offset": 2048, 00:25:39.090 "data_size": 63488 00:25:39.090 }, 00:25:39.090 { 00:25:39.090 "name": "BaseBdev4", 00:25:39.090 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:39.090 "is_configured": true, 00:25:39.090 "data_offset": 2048, 00:25:39.090 "data_size": 63488 00:25:39.090 } 00:25:39.090 ] 00:25:39.090 }' 00:25:39.090 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:39.090 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:39.090 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:39.090 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:39.090 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:39.350 [2024-07-12 13:51:27.803750] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:39.351 [2024-07-12 13:51:27.821221] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:39.351 [2024-07-12 13:51:27.821264] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:39.351 [2024-07-12 13:51:27.821280] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:39.351 [2024-07-12 13:51:27.821288] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.351 13:51:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.611 13:51:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:39.611 "name": "raid_bdev1", 00:25:39.611 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:39.611 "strip_size_kb": 0, 00:25:39.611 "state": "online", 00:25:39.611 "raid_level": "raid1", 00:25:39.611 "superblock": true, 00:25:39.611 "num_base_bdevs": 4, 00:25:39.611 "num_base_bdevs_discovered": 2, 00:25:39.611 "num_base_bdevs_operational": 2, 00:25:39.611 "base_bdevs_list": [ 00:25:39.611 { 00:25:39.611 "name": null, 00:25:39.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.611 "is_configured": false, 00:25:39.611 "data_offset": 2048, 00:25:39.611 "data_size": 63488 00:25:39.611 }, 00:25:39.611 { 00:25:39.611 "name": null, 00:25:39.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.611 "is_configured": false, 00:25:39.611 "data_offset": 2048, 00:25:39.611 "data_size": 63488 00:25:39.611 }, 00:25:39.611 { 00:25:39.611 "name": "BaseBdev3", 00:25:39.611 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:39.611 "is_configured": true, 00:25:39.611 "data_offset": 2048, 00:25:39.611 "data_size": 63488 00:25:39.611 }, 00:25:39.611 { 00:25:39.611 "name": "BaseBdev4", 00:25:39.611 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:39.611 "is_configured": true, 00:25:39.611 "data_offset": 2048, 00:25:39.611 "data_size": 63488 
00:25:39.611 } 00:25:39.611 ] 00:25:39.611 }' 00:25:39.611 13:51:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:39.611 13:51:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:40.181 13:51:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:40.440 [2024-07-12 13:51:28.924780] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:40.440 [2024-07-12 13:51:28.924838] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:40.440 [2024-07-12 13:51:28.924861] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1886950 00:25:40.440 [2024-07-12 13:51:28.924874] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:40.440 [2024-07-12 13:51:28.925270] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:40.440 [2024-07-12 13:51:28.925288] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:40.440 [2024-07-12 13:51:28.925371] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:40.440 [2024-07-12 13:51:28.925383] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:40.440 [2024-07-12 13:51:28.925394] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:40.440 [2024-07-12 13:51:28.925414] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:40.441 [2024-07-12 13:51:28.929387] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1458fa0 00:25:40.441 spare 00:25:40.441 [2024-07-12 13:51:28.930782] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:40.441 13:51:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:41.381 13:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:41.381 13:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:41.381 13:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:41.381 13:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:41.381 13:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:41.381 13:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.381 13:51:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.641 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:41.641 "name": "raid_bdev1", 00:25:41.641 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:41.641 "strip_size_kb": 0, 00:25:41.641 "state": "online", 00:25:41.641 "raid_level": "raid1", 00:25:41.641 "superblock": true, 00:25:41.641 "num_base_bdevs": 4, 00:25:41.641 "num_base_bdevs_discovered": 3, 00:25:41.641 "num_base_bdevs_operational": 3, 00:25:41.641 "process": { 00:25:41.641 "type": "rebuild", 00:25:41.641 "target": 
"spare", 00:25:41.641 "progress": { 00:25:41.641 "blocks": 24576, 00:25:41.641 "percent": 38 00:25:41.641 } 00:25:41.641 }, 00:25:41.641 "base_bdevs_list": [ 00:25:41.641 { 00:25:41.641 "name": "spare", 00:25:41.641 "uuid": "604b177c-8ac7-5a3a-b985-92b366fbc699", 00:25:41.641 "is_configured": true, 00:25:41.641 "data_offset": 2048, 00:25:41.641 "data_size": 63488 00:25:41.641 }, 00:25:41.641 { 00:25:41.641 "name": null, 00:25:41.641 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:41.641 "is_configured": false, 00:25:41.641 "data_offset": 2048, 00:25:41.641 "data_size": 63488 00:25:41.641 }, 00:25:41.641 { 00:25:41.641 "name": "BaseBdev3", 00:25:41.641 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:41.641 "is_configured": true, 00:25:41.641 "data_offset": 2048, 00:25:41.641 "data_size": 63488 00:25:41.641 }, 00:25:41.641 { 00:25:41.641 "name": "BaseBdev4", 00:25:41.641 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:41.641 "is_configured": true, 00:25:41.641 "data_offset": 2048, 00:25:41.641 "data_size": 63488 00:25:41.641 } 00:25:41.641 ] 00:25:41.641 }' 00:25:41.641 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:41.901 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:41.901 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:41.901 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:41.901 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:42.161 [2024-07-12 13:51:30.518693] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:42.161 [2024-07-12 13:51:30.543201] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:42.161 [2024-07-12 13:51:30.543245] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:42.161 [2024-07-12 13:51:30.543262] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:42.161 [2024-07-12 13:51:30.543270] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.161 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.421 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:42.421 "name": "raid_bdev1", 00:25:42.421 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:42.421 "strip_size_kb": 0, 00:25:42.421 "state": "online", 00:25:42.421 "raid_level": "raid1", 00:25:42.421 "superblock": true, 00:25:42.421 "num_base_bdevs": 4, 00:25:42.421 "num_base_bdevs_discovered": 2, 00:25:42.421 "num_base_bdevs_operational": 2, 00:25:42.421 "base_bdevs_list": [ 00:25:42.421 { 00:25:42.421 "name": null, 00:25:42.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.421 "is_configured": false, 00:25:42.421 "data_offset": 2048, 00:25:42.421 "data_size": 63488 00:25:42.421 }, 00:25:42.421 { 00:25:42.421 "name": null, 00:25:42.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.421 "is_configured": false, 00:25:42.421 "data_offset": 2048, 00:25:42.421 "data_size": 63488 00:25:42.421 }, 00:25:42.421 { 00:25:42.421 "name": "BaseBdev3", 00:25:42.421 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:42.421 "is_configured": true, 00:25:42.421 "data_offset": 2048, 00:25:42.421 "data_size": 63488 00:25:42.421 }, 00:25:42.421 { 00:25:42.421 "name": "BaseBdev4", 00:25:42.421 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:42.421 "is_configured": true, 00:25:42.421 "data_offset": 2048, 00:25:42.421 "data_size": 63488 00:25:42.421 } 00:25:42.421 ] 00:25:42.421 }' 00:25:42.421 13:51:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:42.421 13:51:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:43.361 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:43.361 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:43.361 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:43.361 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:43.361 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:43.361 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.361 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.361 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:43.361 "name": "raid_bdev1", 00:25:43.361 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:43.361 "strip_size_kb": 0, 00:25:43.361 "state": "online", 00:25:43.361 "raid_level": "raid1", 00:25:43.361 "superblock": true, 00:25:43.361 "num_base_bdevs": 4, 00:25:43.361 "num_base_bdevs_discovered": 2, 00:25:43.361 "num_base_bdevs_operational": 2, 00:25:43.361 "base_bdevs_list": [ 00:25:43.361 { 00:25:43.361 "name": null, 00:25:43.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.361 "is_configured": false, 00:25:43.361 "data_offset": 2048, 00:25:43.361 "data_size": 63488 00:25:43.361 }, 00:25:43.361 { 00:25:43.361 "name": null, 00:25:43.361 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:43.361 "is_configured": false, 00:25:43.361 "data_offset": 2048, 00:25:43.361 "data_size": 63488 00:25:43.361 }, 00:25:43.361 { 00:25:43.361 "name": "BaseBdev3", 00:25:43.361 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:43.361 "is_configured": true, 00:25:43.361 "data_offset": 2048, 00:25:43.361 "data_size": 63488 00:25:43.361 }, 00:25:43.361 { 00:25:43.361 "name": "BaseBdev4", 00:25:43.361 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:43.361 "is_configured": true, 00:25:43.361 "data_offset": 2048, 00:25:43.361 "data_size": 63488 00:25:43.361 } 00:25:43.361 ] 00:25:43.361 }' 00:25:43.361 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:43.361 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:43.361 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:43.621 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:43.621 13:51:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:43.881 13:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:43.881 [2024-07-12 13:51:32.436555] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:43.881 [2024-07-12 13:51:32.436611] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:43.881 [2024-07-12 13:51:32.436631] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1882a90 00:25:43.881 [2024-07-12 13:51:32.436644] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:43.881 [2024-07-12 13:51:32.437028] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:43.881 [2024-07-12 13:51:32.437047] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:43.881 [2024-07-12 13:51:32.437117] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:43.881 [2024-07-12 13:51:32.437129] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:43.881 [2024-07-12 13:51:32.437140] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:43.881 BaseBdev1 00:25:43.881 13:51:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:45.263 13:51:33 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.263 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:45.263 "name": "raid_bdev1", 00:25:45.263 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:45.263 "strip_size_kb": 0, 00:25:45.263 "state": "online", 00:25:45.263 "raid_level": "raid1", 00:25:45.263 "superblock": true, 00:25:45.263 "num_base_bdevs": 4, 00:25:45.263 "num_base_bdevs_discovered": 2, 00:25:45.263 "num_base_bdevs_operational": 2, 00:25:45.263 "base_bdevs_list": [ 00:25:45.263 { 00:25:45.263 "name": null, 00:25:45.263 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.263 "is_configured": false, 00:25:45.263 "data_offset": 2048, 00:25:45.263 "data_size": 63488 00:25:45.263 }, 00:25:45.263 { 00:25:45.263 "name": null, 00:25:45.263 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.263 "is_configured": false, 00:25:45.263 "data_offset": 2048, 00:25:45.263 "data_size": 63488 00:25:45.263 }, 00:25:45.263 { 00:25:45.264 "name": "BaseBdev3", 00:25:45.264 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:45.264 "is_configured": true, 00:25:45.264 "data_offset": 2048, 00:25:45.264 "data_size": 63488 00:25:45.264 }, 00:25:45.264 { 00:25:45.264 "name": "BaseBdev4", 00:25:45.264 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:45.264 "is_configured": true, 00:25:45.264 "data_offset": 2048, 00:25:45.264 "data_size": 63488 00:25:45.264 } 00:25:45.264 ] 00:25:45.264 }' 00:25:45.264 13:51:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:45.264 13:51:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:45.833 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:45.833 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:45.833 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:45.833 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:45.833 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:45.833 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.833 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:46.092 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:46.092 "name": "raid_bdev1", 00:25:46.092 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:46.092 "strip_size_kb": 0, 00:25:46.092 "state": "online", 00:25:46.092 "raid_level": "raid1", 00:25:46.092 "superblock": true, 
00:25:46.092 "num_base_bdevs": 4, 00:25:46.092 "num_base_bdevs_discovered": 2, 00:25:46.092 "num_base_bdevs_operational": 2, 00:25:46.093 "base_bdevs_list": [ 00:25:46.093 { 00:25:46.093 "name": null, 00:25:46.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:46.093 "is_configured": false, 00:25:46.093 "data_offset": 2048, 00:25:46.093 "data_size": 63488 00:25:46.093 }, 00:25:46.093 { 00:25:46.093 "name": null, 00:25:46.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:46.093 "is_configured": false, 00:25:46.093 "data_offset": 2048, 00:25:46.093 "data_size": 63488 00:25:46.093 }, 00:25:46.093 { 00:25:46.093 "name": "BaseBdev3", 00:25:46.093 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:46.093 "is_configured": true, 00:25:46.093 "data_offset": 2048, 00:25:46.093 "data_size": 63488 00:25:46.093 }, 00:25:46.093 { 00:25:46.093 "name": "BaseBdev4", 00:25:46.093 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:46.093 "is_configured": true, 00:25:46.093 "data_offset": 2048, 00:25:46.093 "data_size": 63488 00:25:46.093 } 00:25:46.093 ] 00:25:46.093 }' 00:25:46.093 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:46.093 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:46.093 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:46.352 13:51:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:46.921 [2024-07-12 13:51:35.219945] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:46.921 [2024-07-12 13:51:35.220078] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:46.921 [2024-07-12 13:51:35.220093] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:46.921 request: 00:25:46.921 { 00:25:46.921 "base_bdev": "BaseBdev1", 00:25:46.921 "raid_bdev": "raid_bdev1", 00:25:46.921 "method": "bdev_raid_add_base_bdev", 00:25:46.921 "req_id": 1 00:25:46.921 } 00:25:46.921 Got JSON-RPC error response 00:25:46.921 response: 00:25:46.921 { 00:25:46.921 "code": -22, 00:25:46.921 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:46.921 } 00:25:46.921 13:51:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:25:46.921 13:51:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:46.921 13:51:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:46.921 13:51:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:46.922 13:51:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.860 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.428 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:48.428 "name": "raid_bdev1", 00:25:48.428 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:48.428 "strip_size_kb": 0, 00:25:48.428 "state": "online", 00:25:48.428 "raid_level": "raid1", 00:25:48.428 "superblock": true, 00:25:48.428 "num_base_bdevs": 4, 00:25:48.428 "num_base_bdevs_discovered": 2, 00:25:48.428 "num_base_bdevs_operational": 2, 00:25:48.428 "base_bdevs_list": [ 00:25:48.428 { 00:25:48.429 "name": null, 00:25:48.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.429 "is_configured": false, 00:25:48.429 "data_offset": 2048, 00:25:48.429 "data_size": 63488 00:25:48.429 }, 00:25:48.429 { 00:25:48.429 "name": null, 00:25:48.429 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:48.429 "is_configured": false, 00:25:48.429 "data_offset": 2048, 00:25:48.429 "data_size": 63488 00:25:48.429 }, 00:25:48.429 { 00:25:48.429 "name": "BaseBdev3", 00:25:48.429 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:48.429 "is_configured": true, 00:25:48.429 "data_offset": 2048, 00:25:48.429 "data_size": 63488 00:25:48.429 }, 00:25:48.429 { 00:25:48.429 "name": "BaseBdev4", 00:25:48.429 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:48.429 "is_configured": true, 00:25:48.429 "data_offset": 2048, 00:25:48.429 "data_size": 63488 00:25:48.429 } 00:25:48.429 ] 00:25:48.429 }' 00:25:48.429 13:51:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:48.429 13:51:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:48.996 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:48.996 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:48.996 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:48.996 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:48.996 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:48.996 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.996 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:49.256 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:49.256 "name": "raid_bdev1", 00:25:49.256 "uuid": "1fa9071f-9cfb-457c-8282-7105938fd922", 00:25:49.256 "strip_size_kb": 0, 00:25:49.256 "state": "online", 00:25:49.256 "raid_level": "raid1", 00:25:49.256 "superblock": true, 00:25:49.256 "num_base_bdevs": 4, 00:25:49.256 "num_base_bdevs_discovered": 2, 00:25:49.256 "num_base_bdevs_operational": 2, 00:25:49.256 "base_bdevs_list": [ 00:25:49.256 { 00:25:49.256 "name": null, 00:25:49.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.256 "is_configured": false, 00:25:49.256 "data_offset": 2048, 00:25:49.256 "data_size": 63488 00:25:49.256 }, 00:25:49.256 { 00:25:49.256 "name": null, 00:25:49.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.256 "is_configured": false, 00:25:49.256 "data_offset": 2048, 00:25:49.256 "data_size": 63488 00:25:49.256 }, 00:25:49.256 { 00:25:49.256 "name": "BaseBdev3", 00:25:49.256 "uuid": "43c783db-042a-5f29-9692-bea85654d1ea", 00:25:49.256 "is_configured": true, 00:25:49.256 "data_offset": 2048, 00:25:49.256 "data_size": 63488 00:25:49.256 }, 00:25:49.256 { 00:25:49.256 "name": "BaseBdev4", 00:25:49.256 "uuid": "b934afcf-5f5f-59d4-bb95-0a7875beddd5", 00:25:49.256 "is_configured": true, 00:25:49.256 "data_offset": 2048, 00:25:49.256 "data_size": 63488 00:25:49.256 } 00:25:49.256 ] 00:25:49.256 }' 00:25:49.256 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:49.256 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:49.256 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 557149 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 557149 ']' 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 557149 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 557149 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 557149' 00:25:49.515 killing process with pid 557149 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 557149 00:25:49.515 Received shutdown signal, test time was about 60.000000 seconds 00:25:49.515 00:25:49.515 Latency(us) 00:25:49.515 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:49.515 =================================================================================================================== 00:25:49.515 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:49.515 [2024-07-12 13:51:37.947355] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:49.515 [2024-07-12 13:51:37.947458] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:49.515 13:51:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 557149 00:25:49.515 [2024-07-12 13:51:37.947514] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:49.515 [2024-07-12 13:51:37.947527] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18864e0 name raid_bdev1, state offline 00:25:49.515 [2024-07-12 13:51:38.001867] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:25:49.774 00:25:49.774 real 0m39.428s 00:25:49.774 user 0m57.688s 00:25:49.774 sys 0m7.163s 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:49.774 ************************************ 00:25:49.774 END TEST raid_rebuild_test_sb 00:25:49.774 ************************************ 00:25:49.774 13:51:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:49.774 13:51:38 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:25:49.774 13:51:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:49.774 13:51:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:49.774 13:51:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:49.774 ************************************ 00:25:49.774 START TEST raid_rebuild_test_io 00:25:49.774 ************************************ 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=563191 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 563191 /var/tmp/spdk-raid.sock 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L 
bdev_raid 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 563191 ']' 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:49.774 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:49.774 13:51:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:50.033 [2024-07-12 13:51:38.398044] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:25:50.033 [2024-07-12 13:51:38.398113] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid563191 ] 00:25:50.033 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:50.033 Zero copy mechanism will not be used. 00:25:50.033 [2024-07-12 13:51:38.528632] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:50.292 [2024-07-12 13:51:38.635075] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:50.292 [2024-07-12 13:51:38.702106] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:50.292 [2024-07-12 13:51:38.702150] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:50.860 13:51:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:50.860 13:51:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:25:50.860 13:51:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:50.860 13:51:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:51.426 BaseBdev1_malloc 00:25:51.426 13:51:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:51.994 [2024-07-12 13:51:40.325966] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:51.994 [2024-07-12 13:51:40.326022] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:51.994 [2024-07-12 13:51:40.326050] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23f6680 00:25:51.994 [2024-07-12 13:51:40.326064] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:51.994 [2024-07-12 13:51:40.327870] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:51.994 [2024-07-12 13:51:40.327898] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:51.994 BaseBdev1 00:25:51.994 13:51:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:51.994 13:51:40 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:52.563 BaseBdev2_malloc 00:25:52.563 13:51:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:52.563 [2024-07-12 13:51:41.101990] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:52.563 [2024-07-12 13:51:41.102037] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:52.563 [2024-07-12 13:51:41.102062] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23f71a0 00:25:52.563 [2024-07-12 13:51:41.102074] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:52.563 [2024-07-12 13:51:41.103634] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:52.563 [2024-07-12 13:51:41.103662] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:52.563 BaseBdev2 00:25:52.563 13:51:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:52.563 13:51:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:53.131 BaseBdev3_malloc 00:25:53.131 13:51:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:53.698 [2024-07-12 13:51:42.117231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:53.698 [2024-07-12 13:51:42.117278] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:53.698 [2024-07-12 13:51:42.117301] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a4230 00:25:53.698 [2024-07-12 13:51:42.117314] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:53.698 [2024-07-12 13:51:42.118915] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:53.698 [2024-07-12 13:51:42.118953] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:53.698 BaseBdev3 00:25:53.698 13:51:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:53.698 13:51:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:54.266 BaseBdev4_malloc 00:25:54.266 13:51:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:54.840 [2024-07-12 13:51:43.145744] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:54.840 [2024-07-12 13:51:43.145795] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:54.840 [2024-07-12 13:51:43.145817] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a3410 00:25:54.840 [2024-07-12 
13:51:43.145830] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:54.840 [2024-07-12 13:51:43.147388] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:54.840 [2024-07-12 13:51:43.147417] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:54.840 BaseBdev4 00:25:54.840 13:51:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:55.099 spare_malloc 00:25:55.358 13:51:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:55.618 spare_delay 00:25:55.618 13:51:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:56.186 [2024-07-12 13:51:44.678318] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:56.186 [2024-07-12 13:51:44.678368] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:56.186 [2024-07-12 13:51:44.678392] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a7ef0 00:25:56.186 [2024-07-12 13:51:44.678405] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:56.186 [2024-07-12 13:51:44.680054] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:56.186 [2024-07-12 13:51:44.680083] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:56.186 spare 00:25:56.186 13:51:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:56.755 [2024-07-12 13:51:45.191683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:56.755 [2024-07-12 13:51:45.193013] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:56.755 [2024-07-12 13:51:45.193069] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:56.755 [2024-07-12 13:51:45.193114] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:56.756 [2024-07-12 13:51:45.193197] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25271e0 00:25:56.756 [2024-07-12 13:51:45.193207] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:56.756 [2024-07-12 13:51:45.193422] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25a1750 00:25:56.756 [2024-07-12 13:51:45.193571] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25271e0 00:25:56.756 [2024-07-12 13:51:45.193581] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25271e0 00:25:56.756 [2024-07-12 13:51:45.193699] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.756 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:57.325 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:57.325 "name": "raid_bdev1", 00:25:57.325 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:25:57.325 "strip_size_kb": 0, 00:25:57.325 "state": "online", 00:25:57.325 "raid_level": "raid1", 00:25:57.325 "superblock": false, 00:25:57.325 "num_base_bdevs": 4, 00:25:57.325 "num_base_bdevs_discovered": 4, 00:25:57.325 "num_base_bdevs_operational": 4, 00:25:57.325 "base_bdevs_list": [ 00:25:57.325 { 00:25:57.325 "name": "BaseBdev1", 00:25:57.325 "uuid": "9595095a-b147-59f0-8531-2e28bd578137", 00:25:57.325 "is_configured": true, 00:25:57.325 "data_offset": 0, 00:25:57.325 "data_size": 65536 00:25:57.325 }, 00:25:57.325 { 00:25:57.325 "name": "BaseBdev2", 00:25:57.325 "uuid": "70b0f9ef-9ac3-5d84-88b1-b80326efb934", 00:25:57.325 "is_configured": true, 00:25:57.325 "data_offset": 0, 00:25:57.325 "data_size": 65536 00:25:57.325 }, 00:25:57.325 { 00:25:57.325 "name": "BaseBdev3", 00:25:57.325 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:25:57.325 "is_configured": true, 00:25:57.325 "data_offset": 0, 00:25:57.325 "data_size": 65536 00:25:57.325 }, 00:25:57.325 { 00:25:57.325 "name": "BaseBdev4", 00:25:57.325 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:25:57.325 "is_configured": true, 00:25:57.325 "data_offset": 0, 00:25:57.325 "data_size": 65536 00:25:57.325 } 00:25:57.325 ] 00:25:57.325 }' 00:25:57.325 13:51:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:57.325 13:51:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:57.894 13:51:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:57.894 13:51:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:58.463 [2024-07-12 13:51:46.832351] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:58.463 13:51:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:58.463 13:51:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.463 13:51:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:58.722 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:58.722 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:58.722 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:58.722 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:58.981 [2024-07-12 13:51:47.331447] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x252d2b0 00:25:58.981 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:58.981 Zero copy mechanism will not be used. 00:25:58.981 Running I/O for 60 seconds... 00:25:59.241 [2024-07-12 13:51:47.599302] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:59.241 [2024-07-12 13:51:47.599484] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x252d2b0 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.241 13:51:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.810 13:51:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:59.810 "name": "raid_bdev1", 00:25:59.810 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:25:59.810 "strip_size_kb": 0, 00:25:59.810 "state": "online", 00:25:59.810 "raid_level": "raid1", 00:25:59.810 "superblock": false, 00:25:59.810 "num_base_bdevs": 4, 00:25:59.810 "num_base_bdevs_discovered": 3, 00:25:59.810 "num_base_bdevs_operational": 3, 00:25:59.810 "base_bdevs_list": [ 00:25:59.810 { 00:25:59.810 "name": null, 00:25:59.810 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.810 "is_configured": false, 00:25:59.810 "data_offset": 0, 00:25:59.810 "data_size": 65536 00:25:59.810 }, 00:25:59.810 { 00:25:59.810 "name": "BaseBdev2", 00:25:59.810 "uuid": 
"70b0f9ef-9ac3-5d84-88b1-b80326efb934", 00:25:59.810 "is_configured": true, 00:25:59.810 "data_offset": 0, 00:25:59.810 "data_size": 65536 00:25:59.810 }, 00:25:59.810 { 00:25:59.810 "name": "BaseBdev3", 00:25:59.810 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:25:59.810 "is_configured": true, 00:25:59.810 "data_offset": 0, 00:25:59.810 "data_size": 65536 00:25:59.810 }, 00:25:59.810 { 00:25:59.810 "name": "BaseBdev4", 00:25:59.810 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:25:59.810 "is_configured": true, 00:25:59.810 "data_offset": 0, 00:25:59.810 "data_size": 65536 00:25:59.810 } 00:25:59.810 ] 00:25:59.810 }' 00:25:59.810 13:51:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:59.810 13:51:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:00.749 13:51:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:00.749 [2024-07-12 13:51:49.321273] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:01.008 13:51:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:01.008 [2024-07-12 13:51:49.386413] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20fdfa0 00:26:01.008 [2024-07-12 13:51:49.388840] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:01.008 [2024-07-12 13:51:49.519243] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:01.008 [2024-07-12 13:51:49.519637] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:01.267 [2024-07-12 13:51:49.742935] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:01.267 [2024-07-12 13:51:49.743567] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:01.530 [2024-07-12 13:51:50.107895] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:01.789 [2024-07-12 13:51:50.230954] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:01.789 [2024-07-12 13:51:50.231568] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:02.057 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:02.057 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:02.057 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:02.057 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:02.057 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:02.057 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.057 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.322 13:51:50 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:02.322 "name": "raid_bdev1", 00:26:02.322 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:26:02.322 "strip_size_kb": 0, 00:26:02.322 "state": "online", 00:26:02.322 "raid_level": "raid1", 00:26:02.322 "superblock": false, 00:26:02.322 "num_base_bdevs": 4, 00:26:02.322 "num_base_bdevs_discovered": 4, 00:26:02.322 "num_base_bdevs_operational": 4, 00:26:02.322 "process": { 00:26:02.322 "type": "rebuild", 00:26:02.322 "target": "spare", 00:26:02.322 "progress": { 00:26:02.322 "blocks": 14336, 00:26:02.322 "percent": 21 00:26:02.322 } 00:26:02.322 }, 00:26:02.322 "base_bdevs_list": [ 00:26:02.322 { 00:26:02.322 "name": "spare", 00:26:02.322 "uuid": "7cb04633-a472-586d-875a-2775b9b5f82c", 00:26:02.322 "is_configured": true, 00:26:02.322 "data_offset": 0, 00:26:02.322 "data_size": 65536 00:26:02.322 }, 00:26:02.322 { 00:26:02.322 "name": "BaseBdev2", 00:26:02.322 "uuid": "70b0f9ef-9ac3-5d84-88b1-b80326efb934", 00:26:02.322 "is_configured": true, 00:26:02.322 "data_offset": 0, 00:26:02.322 "data_size": 65536 00:26:02.322 }, 00:26:02.322 { 00:26:02.322 "name": "BaseBdev3", 00:26:02.322 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:26:02.322 "is_configured": true, 00:26:02.322 "data_offset": 0, 00:26:02.322 "data_size": 65536 00:26:02.322 }, 00:26:02.322 { 00:26:02.322 "name": "BaseBdev4", 00:26:02.322 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:26:02.322 "is_configured": true, 00:26:02.322 "data_offset": 0, 00:26:02.322 "data_size": 65536 00:26:02.322 } 00:26:02.322 ] 00:26:02.322 }' 00:26:02.322 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:02.322 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:02.322 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:02.322 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:02.322 13:51:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:02.581 [2024-07-12 13:51:51.054487] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:02.840 [2024-07-12 13:51:51.256208] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:02.840 [2024-07-12 13:51:51.297020] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:02.840 [2024-07-12 13:51:51.297197] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:02.840 [2024-07-12 13:51:51.408166] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:03.098 [2024-07-12 13:51:51.430650] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:03.098 [2024-07-12 13:51:51.430685] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:03.098 [2024-07-12 13:51:51.430695] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:03.098 [2024-07-12 13:51:51.470085] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x252d2b0 00:26:03.098 13:51:51 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:03.098 13:51:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:03.098 13:51:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:03.098 13:51:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:03.098 13:51:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:03.098 13:51:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:03.098 13:51:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:03.098 13:51:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:03.098 13:51:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:03.098 13:51:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:03.098 13:51:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.098 13:51:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.665 13:51:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:03.665 "name": "raid_bdev1", 00:26:03.665 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:26:03.665 "strip_size_kb": 0, 00:26:03.665 "state": "online", 00:26:03.665 "raid_level": "raid1", 00:26:03.665 "superblock": false, 00:26:03.665 "num_base_bdevs": 4, 00:26:03.665 "num_base_bdevs_discovered": 3, 00:26:03.665 "num_base_bdevs_operational": 3, 00:26:03.665 "base_bdevs_list": [ 00:26:03.665 { 00:26:03.665 "name": null, 00:26:03.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:03.665 "is_configured": false, 00:26:03.665 "data_offset": 0, 00:26:03.665 "data_size": 65536 00:26:03.665 }, 00:26:03.665 { 00:26:03.665 "name": "BaseBdev2", 00:26:03.665 "uuid": "70b0f9ef-9ac3-5d84-88b1-b80326efb934", 00:26:03.665 "is_configured": true, 00:26:03.665 "data_offset": 0, 00:26:03.665 "data_size": 65536 00:26:03.665 }, 00:26:03.665 { 00:26:03.665 "name": "BaseBdev3", 00:26:03.665 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:26:03.665 "is_configured": true, 00:26:03.665 "data_offset": 0, 00:26:03.665 "data_size": 65536 00:26:03.665 }, 00:26:03.665 { 00:26:03.665 "name": "BaseBdev4", 00:26:03.665 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:26:03.665 "is_configured": true, 00:26:03.665 "data_offset": 0, 00:26:03.665 "data_size": 65536 00:26:03.665 } 00:26:03.665 ] 00:26:03.665 }' 00:26:03.665 13:51:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:03.665 13:51:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:04.610 13:51:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:04.610 13:51:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:04.610 13:51:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:04.610 13:51:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:04.610 13:51:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local 
raid_bdev_info 00:26:04.610 13:51:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.610 13:51:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.871 13:51:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:04.871 "name": "raid_bdev1", 00:26:04.871 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:26:04.871 "strip_size_kb": 0, 00:26:04.871 "state": "online", 00:26:04.871 "raid_level": "raid1", 00:26:04.871 "superblock": false, 00:26:04.871 "num_base_bdevs": 4, 00:26:04.871 "num_base_bdevs_discovered": 3, 00:26:04.871 "num_base_bdevs_operational": 3, 00:26:04.871 "base_bdevs_list": [ 00:26:04.871 { 00:26:04.871 "name": null, 00:26:04.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.871 "is_configured": false, 00:26:04.871 "data_offset": 0, 00:26:04.871 "data_size": 65536 00:26:04.871 }, 00:26:04.871 { 00:26:04.871 "name": "BaseBdev2", 00:26:04.871 "uuid": "70b0f9ef-9ac3-5d84-88b1-b80326efb934", 00:26:04.871 "is_configured": true, 00:26:04.871 "data_offset": 0, 00:26:04.871 "data_size": 65536 00:26:04.871 }, 00:26:04.871 { 00:26:04.871 "name": "BaseBdev3", 00:26:04.871 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:26:04.871 "is_configured": true, 00:26:04.871 "data_offset": 0, 00:26:04.871 "data_size": 65536 00:26:04.871 }, 00:26:04.871 { 00:26:04.871 "name": "BaseBdev4", 00:26:04.871 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:26:04.871 "is_configured": true, 00:26:04.871 "data_offset": 0, 00:26:04.871 "data_size": 65536 00:26:04.871 } 00:26:04.871 ] 00:26:04.871 }' 00:26:04.871 13:51:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:04.871 13:51:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:04.871 13:51:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:04.871 13:51:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:04.871 13:51:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:05.438 [2024-07-12 13:51:53.818385] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:05.438 13:51:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:05.438 [2024-07-12 13:51:53.892239] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x259b230 00:26:05.438 [2024-07-12 13:51:53.893765] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:05.695 [2024-07-12 13:51:54.025031] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:05.695 [2024-07-12 13:51:54.025360] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:05.695 [2024-07-12 13:51:54.230037] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:05.695 [2024-07-12 13:51:54.230572] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:06.262 [2024-07-12 
13:51:54.711057] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:06.521 13:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:06.521 13:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:06.521 13:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:06.521 13:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:06.521 13:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:06.521 13:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.521 13:51:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:06.521 [2024-07-12 13:51:55.073281] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:06.521 [2024-07-12 13:51:55.073755] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:06.780 [2024-07-12 13:51:55.297811] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:06.780 [2024-07-12 13:51:55.298532] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:07.039 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.039 "name": "raid_bdev1", 00:26:07.039 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:26:07.039 "strip_size_kb": 0, 00:26:07.039 "state": "online", 00:26:07.039 "raid_level": "raid1", 00:26:07.039 "superblock": false, 00:26:07.039 "num_base_bdevs": 4, 00:26:07.039 "num_base_bdevs_discovered": 4, 00:26:07.039 "num_base_bdevs_operational": 4, 00:26:07.039 "process": { 00:26:07.039 "type": "rebuild", 00:26:07.039 "target": "spare", 00:26:07.039 "progress": { 00:26:07.039 "blocks": 16384, 00:26:07.039 "percent": 25 00:26:07.039 } 00:26:07.039 }, 00:26:07.039 "base_bdevs_list": [ 00:26:07.039 { 00:26:07.039 "name": "spare", 00:26:07.039 "uuid": "7cb04633-a472-586d-875a-2775b9b5f82c", 00:26:07.039 "is_configured": true, 00:26:07.039 "data_offset": 0, 00:26:07.039 "data_size": 65536 00:26:07.039 }, 00:26:07.039 { 00:26:07.039 "name": "BaseBdev2", 00:26:07.039 "uuid": "70b0f9ef-9ac3-5d84-88b1-b80326efb934", 00:26:07.039 "is_configured": true, 00:26:07.039 "data_offset": 0, 00:26:07.039 "data_size": 65536 00:26:07.039 }, 00:26:07.039 { 00:26:07.039 "name": "BaseBdev3", 00:26:07.039 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:26:07.039 "is_configured": true, 00:26:07.039 "data_offset": 0, 00:26:07.039 "data_size": 65536 00:26:07.039 }, 00:26:07.039 { 00:26:07.039 "name": "BaseBdev4", 00:26:07.039 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:26:07.039 "is_configured": true, 00:26:07.039 "data_offset": 0, 00:26:07.039 "data_size": 65536 00:26:07.039 } 00:26:07.039 ] 00:26:07.039 }' 00:26:07.039 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:07.039 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:07.039 
13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.039 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:07.039 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:26:07.039 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:07.039 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:07.039 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:07.039 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:07.299 [2024-07-12 13:51:55.635975] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:07.299 [2024-07-12 13:51:55.737361] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:07.299 [2024-07-12 13:51:55.748240] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:07.559 [2024-07-12 13:51:55.899172] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x252d2b0 00:26:07.559 [2024-07-12 13:51:55.899210] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x259b230 00:26:07.559 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:07.559 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:26:07.559 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:07.559 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.559 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:07.559 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:07.559 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.559 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.559 13:51:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.559 [2024-07-12 13:51:56.040141] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:26:07.818 [2024-07-12 13:51:56.173047] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:26:08.078 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:08.079 "name": "raid_bdev1", 00:26:08.079 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:26:08.079 "strip_size_kb": 0, 00:26:08.079 "state": "online", 00:26:08.079 "raid_level": "raid1", 00:26:08.079 "superblock": false, 00:26:08.079 "num_base_bdevs": 4, 00:26:08.079 "num_base_bdevs_discovered": 3, 00:26:08.079 "num_base_bdevs_operational": 3, 00:26:08.079 "process": { 00:26:08.079 "type": "rebuild", 00:26:08.079 "target": "spare", 00:26:08.079 "progress": { 
00:26:08.079 "blocks": 32768, 00:26:08.079 "percent": 50 00:26:08.079 } 00:26:08.079 }, 00:26:08.079 "base_bdevs_list": [ 00:26:08.079 { 00:26:08.079 "name": "spare", 00:26:08.079 "uuid": "7cb04633-a472-586d-875a-2775b9b5f82c", 00:26:08.079 "is_configured": true, 00:26:08.079 "data_offset": 0, 00:26:08.079 "data_size": 65536 00:26:08.079 }, 00:26:08.079 { 00:26:08.079 "name": null, 00:26:08.079 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.079 "is_configured": false, 00:26:08.079 "data_offset": 0, 00:26:08.079 "data_size": 65536 00:26:08.079 }, 00:26:08.079 { 00:26:08.079 "name": "BaseBdev3", 00:26:08.079 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:26:08.079 "is_configured": true, 00:26:08.079 "data_offset": 0, 00:26:08.079 "data_size": 65536 00:26:08.079 }, 00:26:08.079 { 00:26:08.079 "name": "BaseBdev4", 00:26:08.079 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:26:08.079 "is_configured": true, 00:26:08.079 "data_offset": 0, 00:26:08.079 "data_size": 65536 00:26:08.079 } 00:26:08.079 ] 00:26:08.079 }' 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:08.079 [2024-07-12 13:51:56.516239] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=980 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.079 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.338 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:08.338 "name": "raid_bdev1", 00:26:08.338 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:26:08.338 "strip_size_kb": 0, 00:26:08.338 "state": "online", 00:26:08.338 "raid_level": "raid1", 00:26:08.338 "superblock": false, 00:26:08.338 "num_base_bdevs": 4, 00:26:08.338 "num_base_bdevs_discovered": 3, 00:26:08.338 "num_base_bdevs_operational": 3, 00:26:08.338 "process": { 00:26:08.338 "type": "rebuild", 00:26:08.338 "target": "spare", 00:26:08.338 "progress": { 00:26:08.338 "blocks": 38912, 00:26:08.338 "percent": 59 00:26:08.338 } 00:26:08.338 }, 00:26:08.338 "base_bdevs_list": [ 00:26:08.338 { 00:26:08.338 "name": "spare", 00:26:08.338 "uuid": 
"7cb04633-a472-586d-875a-2775b9b5f82c", 00:26:08.339 "is_configured": true, 00:26:08.339 "data_offset": 0, 00:26:08.339 "data_size": 65536 00:26:08.339 }, 00:26:08.339 { 00:26:08.339 "name": null, 00:26:08.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.339 "is_configured": false, 00:26:08.339 "data_offset": 0, 00:26:08.339 "data_size": 65536 00:26:08.339 }, 00:26:08.339 { 00:26:08.339 "name": "BaseBdev3", 00:26:08.339 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:26:08.339 "is_configured": true, 00:26:08.339 "data_offset": 0, 00:26:08.339 "data_size": 65536 00:26:08.339 }, 00:26:08.339 { 00:26:08.339 "name": "BaseBdev4", 00:26:08.339 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:26:08.339 "is_configured": true, 00:26:08.339 "data_offset": 0, 00:26:08.339 "data_size": 65536 00:26:08.339 } 00:26:08.339 ] 00:26:08.339 }' 00:26:08.339 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:08.339 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:08.339 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:08.339 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:08.339 13:51:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:09.720 13:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:09.720 13:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:09.720 13:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:09.720 13:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:09.720 13:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:09.720 13:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:09.720 13:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.720 13:51:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:09.720 13:51:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:09.720 "name": "raid_bdev1", 00:26:09.720 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:26:09.720 "strip_size_kb": 0, 00:26:09.720 "state": "online", 00:26:09.720 "raid_level": "raid1", 00:26:09.720 "superblock": false, 00:26:09.720 "num_base_bdevs": 4, 00:26:09.720 "num_base_bdevs_discovered": 3, 00:26:09.720 "num_base_bdevs_operational": 3, 00:26:09.720 "process": { 00:26:09.720 "type": "rebuild", 00:26:09.720 "target": "spare", 00:26:09.720 "progress": { 00:26:09.720 "blocks": 61440, 00:26:09.720 "percent": 93 00:26:09.720 } 00:26:09.720 }, 00:26:09.720 "base_bdevs_list": [ 00:26:09.720 { 00:26:09.720 "name": "spare", 00:26:09.720 "uuid": "7cb04633-a472-586d-875a-2775b9b5f82c", 00:26:09.720 "is_configured": true, 00:26:09.720 "data_offset": 0, 00:26:09.720 "data_size": 65536 00:26:09.720 }, 00:26:09.720 { 00:26:09.720 "name": null, 00:26:09.720 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:09.720 "is_configured": false, 00:26:09.720 "data_offset": 0, 00:26:09.720 "data_size": 65536 00:26:09.720 }, 00:26:09.720 { 
00:26:09.720 "name": "BaseBdev3", 00:26:09.720 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:26:09.720 "is_configured": true, 00:26:09.720 "data_offset": 0, 00:26:09.720 "data_size": 65536 00:26:09.720 }, 00:26:09.720 { 00:26:09.720 "name": "BaseBdev4", 00:26:09.720 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:26:09.720 "is_configured": true, 00:26:09.720 "data_offset": 0, 00:26:09.720 "data_size": 65536 00:26:09.720 } 00:26:09.720 ] 00:26:09.720 }' 00:26:09.720 13:51:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:09.720 13:51:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:09.720 13:51:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:09.720 [2024-07-12 13:51:58.240025] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:09.720 13:51:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:09.720 13:51:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:09.979 [2024-07-12 13:51:58.307326] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:09.979 [2024-07-12 13:51:58.308614] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:10.919 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:10.919 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:10.919 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:10.919 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:10.919 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:10.919 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:10.919 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.919 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:11.490 "name": "raid_bdev1", 00:26:11.490 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:26:11.490 "strip_size_kb": 0, 00:26:11.490 "state": "online", 00:26:11.490 "raid_level": "raid1", 00:26:11.490 "superblock": false, 00:26:11.490 "num_base_bdevs": 4, 00:26:11.490 "num_base_bdevs_discovered": 3, 00:26:11.490 "num_base_bdevs_operational": 3, 00:26:11.490 "base_bdevs_list": [ 00:26:11.490 { 00:26:11.490 "name": "spare", 00:26:11.490 "uuid": "7cb04633-a472-586d-875a-2775b9b5f82c", 00:26:11.490 "is_configured": true, 00:26:11.490 "data_offset": 0, 00:26:11.490 "data_size": 65536 00:26:11.490 }, 00:26:11.490 { 00:26:11.490 "name": null, 00:26:11.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.490 "is_configured": false, 00:26:11.490 "data_offset": 0, 00:26:11.490 "data_size": 65536 00:26:11.490 }, 00:26:11.490 { 00:26:11.490 "name": "BaseBdev3", 00:26:11.490 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:26:11.490 "is_configured": true, 00:26:11.490 "data_offset": 0, 00:26:11.490 "data_size": 65536 00:26:11.490 }, 
00:26:11.490 { 00:26:11.490 "name": "BaseBdev4", 00:26:11.490 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:26:11.490 "is_configured": true, 00:26:11.490 "data_offset": 0, 00:26:11.490 "data_size": 65536 00:26:11.490 } 00:26:11.490 ] 00:26:11.490 }' 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.490 13:51:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:11.750 "name": "raid_bdev1", 00:26:11.750 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:26:11.750 "strip_size_kb": 0, 00:26:11.750 "state": "online", 00:26:11.750 "raid_level": "raid1", 00:26:11.750 "superblock": false, 00:26:11.750 "num_base_bdevs": 4, 00:26:11.750 "num_base_bdevs_discovered": 3, 00:26:11.750 "num_base_bdevs_operational": 3, 00:26:11.750 "base_bdevs_list": [ 00:26:11.750 { 00:26:11.750 "name": "spare", 00:26:11.750 "uuid": "7cb04633-a472-586d-875a-2775b9b5f82c", 00:26:11.750 "is_configured": true, 00:26:11.750 "data_offset": 0, 00:26:11.750 "data_size": 65536 00:26:11.750 }, 00:26:11.750 { 00:26:11.750 "name": null, 00:26:11.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.750 "is_configured": false, 00:26:11.750 "data_offset": 0, 00:26:11.750 "data_size": 65536 00:26:11.750 }, 00:26:11.750 { 00:26:11.750 "name": "BaseBdev3", 00:26:11.750 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:26:11.750 "is_configured": true, 00:26:11.750 "data_offset": 0, 00:26:11.750 "data_size": 65536 00:26:11.750 }, 00:26:11.750 { 00:26:11.750 "name": "BaseBdev4", 00:26:11.750 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:26:11.750 "is_configured": true, 00:26:11.750 "data_offset": 0, 00:26:11.750 "data_size": 65536 00:26:11.750 } 00:26:11.750 ] 00:26:11.750 }' 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:11.750 13:52:00 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.750 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.009 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:12.010 "name": "raid_bdev1", 00:26:12.010 "uuid": "731a2499-8985-48ff-a7a3-8a2b8a5fe622", 00:26:12.010 "strip_size_kb": 0, 00:26:12.010 "state": "online", 00:26:12.010 "raid_level": "raid1", 00:26:12.010 "superblock": false, 00:26:12.010 "num_base_bdevs": 4, 00:26:12.010 "num_base_bdevs_discovered": 3, 00:26:12.010 "num_base_bdevs_operational": 3, 00:26:12.010 "base_bdevs_list": [ 00:26:12.010 { 00:26:12.010 "name": "spare", 00:26:12.010 "uuid": "7cb04633-a472-586d-875a-2775b9b5f82c", 00:26:12.010 "is_configured": true, 00:26:12.010 "data_offset": 0, 00:26:12.010 "data_size": 65536 00:26:12.010 }, 00:26:12.010 { 00:26:12.010 "name": null, 00:26:12.010 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:12.010 "is_configured": false, 00:26:12.010 "data_offset": 0, 00:26:12.010 "data_size": 65536 00:26:12.010 }, 00:26:12.010 { 00:26:12.010 "name": "BaseBdev3", 00:26:12.010 "uuid": "3f3f79b9-8b8e-5faa-857e-409652c3c6d9", 00:26:12.010 "is_configured": true, 00:26:12.010 "data_offset": 0, 00:26:12.010 "data_size": 65536 00:26:12.010 }, 00:26:12.010 { 00:26:12.010 "name": "BaseBdev4", 00:26:12.010 "uuid": "940a8897-9fb0-5f02-976d-ce2cba9ef97d", 00:26:12.010 "is_configured": true, 00:26:12.010 "data_offset": 0, 00:26:12.010 "data_size": 65536 00:26:12.010 } 00:26:12.010 ] 00:26:12.010 }' 00:26:12.010 13:52:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:12.010 13:52:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:12.577 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:12.836 [2024-07-12 13:52:01.338695] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:12.836 [2024-07-12 13:52:01.338725] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:13.095 00:26:13.095 Latency(us) 00:26:13.095 Device Information : runtime(s) IOPS 
MiB/s Fail/s TO/s Average min max 00:26:13.095 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:13.095 raid_bdev1 : 14.05 104.62 313.86 0.00 0.00 13079.54 306.31 122181.90 00:26:13.095 =================================================================================================================== 00:26:13.095 Total : 104.62 313.86 0.00 0.00 13079.54 306.31 122181.90 00:26:13.095 [2024-07-12 13:52:01.422935] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:13.095 [2024-07-12 13:52:01.422965] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:13.095 [2024-07-12 13:52:01.423058] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:13.095 [2024-07-12 13:52:01.423070] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25271e0 name raid_bdev1, state offline 00:26:13.095 0 00:26:13.095 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.095 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:13.354 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:13.354 /dev/nbd0 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@871 -- # break 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:13.613 1+0 records in 00:26:13.613 1+0 records out 00:26:13.613 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277978 s, 14.7 MB/s 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:13.613 13:52:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:13.873 /dev/nbd1 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:13.873 
13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:13.873 1+0 records in 00:26:13.873 1+0 records out 00:26:13.873 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304729 s, 13.4 MB/s 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:13.873 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # 
return 0 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:14.132 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:26:14.392 /dev/nbd1 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:14.392 1+0 records in 00:26:14.392 1+0 records out 00:26:14.392 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247987 s, 16.5 MB/s 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:14.392 
13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:14.392 13:52:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:14.658 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:14.931 13:52:03 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 563191 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 563191 ']' 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 563191 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:26:14.931 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:15.245 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 563191 00:26:15.245 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:15.245 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:15.245 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 563191' 00:26:15.245 killing process with pid 563191 00:26:15.245 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 563191 00:26:15.245 Received shutdown signal, test time was about 16.174097 seconds 00:26:15.245 00:26:15.245 Latency(us) 00:26:15.245 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:15.245 =================================================================================================================== 00:26:15.245 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:15.245 [2024-07-12 13:52:03.547432] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:15.245 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 563191 00:26:15.245 [2024-07-12 13:52:03.590644] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:15.532 00:26:15.532 real 0m25.500s 00:26:15.532 user 0m41.629s 00:26:15.532 sys 0m4.295s 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:15.532 ************************************ 00:26:15.532 END TEST raid_rebuild_test_io 00:26:15.532 ************************************ 00:26:15.532 13:52:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:15.532 13:52:03 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:26:15.532 13:52:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:15.532 13:52:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:15.532 13:52:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:15.532 ************************************ 00:26:15.532 START TEST raid_rebuild_test_sb_io 00:26:15.532 ************************************ 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 
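The sb_io variant traced below re-runs the rebuild test with superblock=true and background_io=true (arguments raid1 4 true true true). A minimal sketch of the launch pattern these lines follow, assuming the same bdevperf binary, RPC socket and flags shown in the trace; this is an illustration, not the literal bdev_raid.sh code:

    # paths/flags copied from the trace above; raid_pid/waitforlisten names are the test's own
    bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
    sock=/var/tmp/spdk-raid.sock
    # background IO: 60 s of 50/50 randrw, 3M IO size, queue depth 2, against raid_bdev1
    $bdevperf -r $sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!
    # waitforlisten blocks until the RPC socket is up before any bdev_* RPCs are issued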
00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=566610 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 566610 /var/tmp/spdk-raid.sock 00:26:15.532 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:15.533 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 566610 ']' 00:26:15.533 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- 
# local rpc_addr=/var/tmp/spdk-raid.sock 00:26:15.533 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:15.533 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:15.533 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:15.533 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:15.533 13:52:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:15.533 [2024-07-12 13:52:03.986827] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:26:15.533 [2024-07-12 13:52:03.986892] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid566610 ] 00:26:15.533 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:15.533 Zero copy mechanism will not be used. 00:26:15.802 [2024-07-12 13:52:04.115288] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:15.802 [2024-07-12 13:52:04.217705] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:15.802 [2024-07-12 13:52:04.279949] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:15.802 [2024-07-12 13:52:04.279988] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:16.370 13:52:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:16.370 13:52:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:26:16.370 13:52:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:16.370 13:52:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:16.628 BaseBdev1_malloc 00:26:16.629 13:52:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:16.887 [2024-07-12 13:52:05.372085] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:16.887 [2024-07-12 13:52:05.372134] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:16.887 [2024-07-12 13:52:05.372158] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c3680 00:26:16.887 [2024-07-12 13:52:05.372171] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:16.887 [2024-07-12 13:52:05.373940] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:16.887 [2024-07-12 13:52:05.373969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:16.887 BaseBdev1 00:26:16.887 13:52:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:16.887 13:52:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 
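A condensed sketch of the base-bdev stack the surrounding trace builds over RPC (assumption: a hand-written rendering for readability, not the literal loop in bdev_raid.sh). Each base bdev is a 32 MiB, 512-byte-block malloc bdev wrapped in a passthru bdev so the raid module can claim it, and the four passthru bdevs are then combined into a raid1 bdev with an on-disk superblock (-s):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for bdev in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
        $RPC bdev_malloc_create 32 512 -b ${bdev}_malloc    # 32 MiB backing store, 512 B blocks
        $RPC bdev_passthru_create -b ${bdev}_malloc -p $bdev
    done
    $RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1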
00:26:17.145 BaseBdev2_malloc 00:26:17.145 13:52:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:17.404 [2024-07-12 13:52:05.862242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:17.404 [2024-07-12 13:52:05.862289] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:17.404 [2024-07-12 13:52:05.862312] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c41a0 00:26:17.404 [2024-07-12 13:52:05.862325] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:17.404 [2024-07-12 13:52:05.863883] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:17.404 [2024-07-12 13:52:05.863910] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:17.404 BaseBdev2 00:26:17.404 13:52:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:17.404 13:52:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:17.663 BaseBdev3_malloc 00:26:17.663 13:52:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:17.921 [2024-07-12 13:52:06.360168] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:17.921 [2024-07-12 13:52:06.360213] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:17.921 [2024-07-12 13:52:06.360233] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1871230 00:26:17.921 [2024-07-12 13:52:06.360246] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:17.921 [2024-07-12 13:52:06.361772] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:17.922 [2024-07-12 13:52:06.361800] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:17.922 BaseBdev3 00:26:17.922 13:52:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:17.922 13:52:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:18.181 BaseBdev4_malloc 00:26:18.181 13:52:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:18.440 [2024-07-12 13:52:06.855298] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:18.440 [2024-07-12 13:52:06.855347] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:18.440 [2024-07-12 13:52:06.855368] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1870410 00:26:18.440 [2024-07-12 13:52:06.855381] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:18.440 [2024-07-12 13:52:06.856973] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:26:18.440 [2024-07-12 13:52:06.857002] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:26:18.440 BaseBdev4 00:26:18.440 13:52:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:18.699 spare_malloc 00:26:18.699 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:18.957 spare_delay 00:26:18.957 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:19.215 [2024-07-12 13:52:07.591049] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:19.215 [2024-07-12 13:52:07.591097] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:19.215 [2024-07-12 13:52:07.591118] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1874ef0 00:26:19.215 [2024-07-12 13:52:07.591130] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:19.215 [2024-07-12 13:52:07.592698] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:19.215 [2024-07-12 13:52:07.592727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:19.215 spare 00:26:19.215 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:19.473 [2024-07-12 13:52:07.831715] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:19.473 [2024-07-12 13:52:07.833033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:19.473 [2024-07-12 13:52:07.833091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:19.473 [2024-07-12 13:52:07.833138] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:19.473 [2024-07-12 13:52:07.833335] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17f41e0 00:26:19.473 [2024-07-12 13:52:07.833347] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:19.473 [2024-07-12 13:52:07.833552] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x186e750 00:26:19.473 [2024-07-12 13:52:07.833702] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17f41e0 00:26:19.473 [2024-07-12 13:52:07.833712] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17f41e0 00:26:19.473 [2024-07-12 13:52:07.833809] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.473 13:52:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:19.731 13:52:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:19.731 "name": "raid_bdev1", 00:26:19.731 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:19.731 "strip_size_kb": 0, 00:26:19.731 "state": "online", 00:26:19.731 "raid_level": "raid1", 00:26:19.731 "superblock": true, 00:26:19.731 "num_base_bdevs": 4, 00:26:19.731 "num_base_bdevs_discovered": 4, 00:26:19.731 "num_base_bdevs_operational": 4, 00:26:19.731 "base_bdevs_list": [ 00:26:19.731 { 00:26:19.731 "name": "BaseBdev1", 00:26:19.731 "uuid": "e7c54d49-f064-5f9e-9759-889198f5a7a4", 00:26:19.731 "is_configured": true, 00:26:19.731 "data_offset": 2048, 00:26:19.731 "data_size": 63488 00:26:19.731 }, 00:26:19.731 { 00:26:19.731 "name": "BaseBdev2", 00:26:19.731 "uuid": "4ee12a93-f033-5f57-bbb5-85ebf2890686", 00:26:19.731 "is_configured": true, 00:26:19.731 "data_offset": 2048, 00:26:19.731 "data_size": 63488 00:26:19.731 }, 00:26:19.731 { 00:26:19.731 "name": "BaseBdev3", 00:26:19.731 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:19.731 "is_configured": true, 00:26:19.731 "data_offset": 2048, 00:26:19.731 "data_size": 63488 00:26:19.731 }, 00:26:19.731 { 00:26:19.731 "name": "BaseBdev4", 00:26:19.731 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:19.731 "is_configured": true, 00:26:19.731 "data_offset": 2048, 00:26:19.731 "data_size": 63488 00:26:19.731 } 00:26:19.731 ] 00:26:19.731 }' 00:26:19.732 13:52:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:19.732 13:52:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:20.298 13:52:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:20.298 13:52:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:20.557 [2024-07-12 13:52:08.926891] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:20.557 13:52:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:26:20.557 13:52:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.557 13:52:08 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:20.815 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:26:20.815 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:26:20.815 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:20.815 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:20.815 [2024-07-12 13:52:09.313732] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c2fb0 00:26:20.815 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:20.815 Zero copy mechanism will not be used. 00:26:20.815 Running I/O for 60 seconds... 00:26:21.074 [2024-07-12 13:52:09.431781] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:21.074 [2024-07-12 13:52:09.432019] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x16c2fb0 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.074 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.332 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:21.332 "name": "raid_bdev1", 00:26:21.332 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:21.332 "strip_size_kb": 0, 00:26:21.332 "state": "online", 00:26:21.332 "raid_level": "raid1", 00:26:21.332 "superblock": true, 00:26:21.332 "num_base_bdevs": 4, 00:26:21.332 "num_base_bdevs_discovered": 3, 00:26:21.332 "num_base_bdevs_operational": 3, 00:26:21.332 "base_bdevs_list": [ 00:26:21.332 { 00:26:21.332 "name": null, 00:26:21.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:21.332 "is_configured": false, 00:26:21.332 "data_offset": 2048, 00:26:21.332 "data_size": 63488 00:26:21.332 }, 00:26:21.332 { 00:26:21.332 "name": "BaseBdev2", 00:26:21.332 "uuid": "4ee12a93-f033-5f57-bbb5-85ebf2890686", 00:26:21.332 "is_configured": true, 
00:26:21.332 "data_offset": 2048, 00:26:21.332 "data_size": 63488 00:26:21.332 }, 00:26:21.332 { 00:26:21.332 "name": "BaseBdev3", 00:26:21.332 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:21.332 "is_configured": true, 00:26:21.332 "data_offset": 2048, 00:26:21.332 "data_size": 63488 00:26:21.332 }, 00:26:21.332 { 00:26:21.332 "name": "BaseBdev4", 00:26:21.332 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:21.332 "is_configured": true, 00:26:21.332 "data_offset": 2048, 00:26:21.332 "data_size": 63488 00:26:21.332 } 00:26:21.332 ] 00:26:21.332 }' 00:26:21.332 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:21.332 13:52:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:21.899 13:52:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:22.158 [2024-07-12 13:52:10.637141] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:22.158 13:52:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:22.158 [2024-07-12 13:52:10.702787] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f6480 00:26:22.158 [2024-07-12 13:52:10.705125] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:22.415 [2024-07-12 13:52:10.825367] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:22.415 [2024-07-12 13:52:10.825897] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:22.415 [2024-07-12 13:52:10.939734] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:22.415 [2024-07-12 13:52:10.940446] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:22.980 [2024-07-12 13:52:11.288608] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:22.980 [2024-07-12 13:52:11.289969] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:22.980 [2024-07-12 13:52:11.508982] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:22.980 [2024-07-12 13:52:11.509329] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:23.238 13:52:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:23.238 13:52:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:23.238 13:52:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:23.238 13:52:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:23.238 13:52:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:23.238 13:52:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.238 13:52:11 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:23.496 [2024-07-12 13:52:11.915432] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:23.496 13:52:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:23.496 "name": "raid_bdev1", 00:26:23.496 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:23.496 "strip_size_kb": 0, 00:26:23.496 "state": "online", 00:26:23.496 "raid_level": "raid1", 00:26:23.496 "superblock": true, 00:26:23.496 "num_base_bdevs": 4, 00:26:23.496 "num_base_bdevs_discovered": 4, 00:26:23.496 "num_base_bdevs_operational": 4, 00:26:23.496 "process": { 00:26:23.496 "type": "rebuild", 00:26:23.496 "target": "spare", 00:26:23.496 "progress": { 00:26:23.496 "blocks": 14336, 00:26:23.496 "percent": 22 00:26:23.496 } 00:26:23.496 }, 00:26:23.496 "base_bdevs_list": [ 00:26:23.496 { 00:26:23.496 "name": "spare", 00:26:23.496 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:23.496 "is_configured": true, 00:26:23.496 "data_offset": 2048, 00:26:23.496 "data_size": 63488 00:26:23.496 }, 00:26:23.496 { 00:26:23.496 "name": "BaseBdev2", 00:26:23.496 "uuid": "4ee12a93-f033-5f57-bbb5-85ebf2890686", 00:26:23.496 "is_configured": true, 00:26:23.496 "data_offset": 2048, 00:26:23.496 "data_size": 63488 00:26:23.496 }, 00:26:23.496 { 00:26:23.496 "name": "BaseBdev3", 00:26:23.496 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:23.496 "is_configured": true, 00:26:23.496 "data_offset": 2048, 00:26:23.496 "data_size": 63488 00:26:23.496 }, 00:26:23.496 { 00:26:23.496 "name": "BaseBdev4", 00:26:23.496 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:23.496 "is_configured": true, 00:26:23.496 "data_offset": 2048, 00:26:23.496 "data_size": 63488 00:26:23.496 } 00:26:23.496 ] 00:26:23.496 }' 00:26:23.496 13:52:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:23.496 13:52:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:23.496 13:52:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:23.496 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:23.496 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:23.754 [2024-07-12 13:52:12.271852] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:24.012 [2024-07-12 13:52:12.344920] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:24.012 [2024-07-12 13:52:12.346057] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:24.012 [2024-07-12 13:52:12.447574] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:24.012 [2024-07-12 13:52:12.449514] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:24.012 [2024-07-12 13:52:12.449542] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:24.012 [2024-07-12 13:52:12.449552] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:24.012 
[2024-07-12 13:52:12.471186] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x16c2fb0 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.012 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.270 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:24.270 "name": "raid_bdev1", 00:26:24.270 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:24.270 "strip_size_kb": 0, 00:26:24.270 "state": "online", 00:26:24.270 "raid_level": "raid1", 00:26:24.270 "superblock": true, 00:26:24.270 "num_base_bdevs": 4, 00:26:24.270 "num_base_bdevs_discovered": 3, 00:26:24.270 "num_base_bdevs_operational": 3, 00:26:24.270 "base_bdevs_list": [ 00:26:24.270 { 00:26:24.270 "name": null, 00:26:24.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.270 "is_configured": false, 00:26:24.270 "data_offset": 2048, 00:26:24.270 "data_size": 63488 00:26:24.270 }, 00:26:24.270 { 00:26:24.270 "name": "BaseBdev2", 00:26:24.270 "uuid": "4ee12a93-f033-5f57-bbb5-85ebf2890686", 00:26:24.270 "is_configured": true, 00:26:24.270 "data_offset": 2048, 00:26:24.270 "data_size": 63488 00:26:24.270 }, 00:26:24.270 { 00:26:24.270 "name": "BaseBdev3", 00:26:24.270 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:24.270 "is_configured": true, 00:26:24.270 "data_offset": 2048, 00:26:24.270 "data_size": 63488 00:26:24.270 }, 00:26:24.270 { 00:26:24.270 "name": "BaseBdev4", 00:26:24.270 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:24.270 "is_configured": true, 00:26:24.270 "data_offset": 2048, 00:26:24.270 "data_size": 63488 00:26:24.270 } 00:26:24.270 ] 00:26:24.270 }' 00:26:24.270 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:24.270 13:52:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:24.836 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:24.836 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:24.836 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=none 00:26:24.836 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:24.836 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:24.836 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.836 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.093 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:25.093 "name": "raid_bdev1", 00:26:25.093 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:25.093 "strip_size_kb": 0, 00:26:25.093 "state": "online", 00:26:25.093 "raid_level": "raid1", 00:26:25.093 "superblock": true, 00:26:25.093 "num_base_bdevs": 4, 00:26:25.093 "num_base_bdevs_discovered": 3, 00:26:25.093 "num_base_bdevs_operational": 3, 00:26:25.093 "base_bdevs_list": [ 00:26:25.093 { 00:26:25.093 "name": null, 00:26:25.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.094 "is_configured": false, 00:26:25.094 "data_offset": 2048, 00:26:25.094 "data_size": 63488 00:26:25.094 }, 00:26:25.094 { 00:26:25.094 "name": "BaseBdev2", 00:26:25.094 "uuid": "4ee12a93-f033-5f57-bbb5-85ebf2890686", 00:26:25.094 "is_configured": true, 00:26:25.094 "data_offset": 2048, 00:26:25.094 "data_size": 63488 00:26:25.094 }, 00:26:25.094 { 00:26:25.094 "name": "BaseBdev3", 00:26:25.094 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:25.094 "is_configured": true, 00:26:25.094 "data_offset": 2048, 00:26:25.094 "data_size": 63488 00:26:25.094 }, 00:26:25.094 { 00:26:25.094 "name": "BaseBdev4", 00:26:25.094 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:25.094 "is_configured": true, 00:26:25.094 "data_offset": 2048, 00:26:25.094 "data_size": 63488 00:26:25.094 } 00:26:25.094 ] 00:26:25.094 }' 00:26:25.094 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:25.094 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:25.094 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:25.351 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:25.351 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:25.607 [2024-07-12 13:52:13.942146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:25.607 [2024-07-12 13:52:13.981021] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f8230 00:26:25.607 [2024-07-12 13:52:13.982619] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:25.607 13:52:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:25.607 [2024-07-12 13:52:14.114644] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:25.876 [2024-07-12 13:52:14.257459] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:25.876 [2024-07-12 13:52:14.257754] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: 
split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:26.134 [2024-07-12 13:52:14.633312] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:26.390 [2024-07-12 13:52:14.802000] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:26.390 [2024-07-12 13:52:14.810749] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:26.648 13:52:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:26.648 13:52:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:26.648 13:52:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:26.648 13:52:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:26.648 13:52:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:26.648 13:52:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.648 13:52:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:26.648 [2024-07-12 13:52:15.200135] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:26.907 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:26.907 "name": "raid_bdev1", 00:26:26.907 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:26.907 "strip_size_kb": 0, 00:26:26.907 "state": "online", 00:26:26.907 "raid_level": "raid1", 00:26:26.907 "superblock": true, 00:26:26.907 "num_base_bdevs": 4, 00:26:26.907 "num_base_bdevs_discovered": 4, 00:26:26.907 "num_base_bdevs_operational": 4, 00:26:26.907 "process": { 00:26:26.907 "type": "rebuild", 00:26:26.907 "target": "spare", 00:26:26.907 "progress": { 00:26:26.907 "blocks": 14336, 00:26:26.907 "percent": 22 00:26:26.907 } 00:26:26.907 }, 00:26:26.907 "base_bdevs_list": [ 00:26:26.907 { 00:26:26.907 "name": "spare", 00:26:26.907 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:26.907 "is_configured": true, 00:26:26.907 "data_offset": 2048, 00:26:26.907 "data_size": 63488 00:26:26.907 }, 00:26:26.907 { 00:26:26.907 "name": "BaseBdev2", 00:26:26.907 "uuid": "4ee12a93-f033-5f57-bbb5-85ebf2890686", 00:26:26.907 "is_configured": true, 00:26:26.908 "data_offset": 2048, 00:26:26.908 "data_size": 63488 00:26:26.908 }, 00:26:26.908 { 00:26:26.908 "name": "BaseBdev3", 00:26:26.908 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:26.908 "is_configured": true, 00:26:26.908 "data_offset": 2048, 00:26:26.908 "data_size": 63488 00:26:26.908 }, 00:26:26.908 { 00:26:26.908 "name": "BaseBdev4", 00:26:26.908 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:26.908 "is_configured": true, 00:26:26.908 "data_offset": 2048, 00:26:26.908 "data_size": 63488 00:26:26.908 } 00:26:26.908 ] 00:26:26.908 }' 00:26:26.908 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:26.908 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:26.908 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:26.908 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:26.908 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:26.908 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:26.908 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:26.908 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:26.908 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:26.908 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:26.908 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:27.165 [2024-07-12 13:52:15.566983] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:27.165 [2024-07-12 13:52:15.575737] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:27.165 [2024-07-12 13:52:15.695280] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x16c2fb0 00:26:27.165 [2024-07-12 13:52:15.695306] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x17f8230 00:26:27.165 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:27.165 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:26:27.165 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:27.165 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:27.165 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:27.165 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:27.165 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:27.165 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.165 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.422 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:27.422 "name": "raid_bdev1", 00:26:27.422 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:27.422 "strip_size_kb": 0, 00:26:27.422 "state": "online", 00:26:27.422 "raid_level": "raid1", 00:26:27.422 "superblock": true, 00:26:27.422 "num_base_bdevs": 4, 00:26:27.422 "num_base_bdevs_discovered": 3, 00:26:27.422 "num_base_bdevs_operational": 3, 00:26:27.422 "process": { 00:26:27.422 "type": "rebuild", 00:26:27.422 "target": "spare", 00:26:27.422 "progress": { 00:26:27.422 "blocks": 22528, 00:26:27.422 "percent": 35 00:26:27.422 } 00:26:27.422 }, 00:26:27.422 "base_bdevs_list": [ 00:26:27.422 { 00:26:27.422 "name": "spare", 00:26:27.422 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:27.422 "is_configured": true, 
00:26:27.422 "data_offset": 2048, 00:26:27.422 "data_size": 63488 00:26:27.422 }, 00:26:27.422 { 00:26:27.422 "name": null, 00:26:27.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.422 "is_configured": false, 00:26:27.422 "data_offset": 2048, 00:26:27.422 "data_size": 63488 00:26:27.422 }, 00:26:27.422 { 00:26:27.422 "name": "BaseBdev3", 00:26:27.422 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:27.422 "is_configured": true, 00:26:27.422 "data_offset": 2048, 00:26:27.422 "data_size": 63488 00:26:27.422 }, 00:26:27.422 { 00:26:27.422 "name": "BaseBdev4", 00:26:27.422 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:27.422 "is_configured": true, 00:26:27.422 "data_offset": 2048, 00:26:27.422 "data_size": 63488 00:26:27.422 } 00:26:27.422 ] 00:26:27.422 }' 00:26:27.422 13:52:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=1000 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.680 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.938 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:27.938 "name": "raid_bdev1", 00:26:27.938 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:27.938 "strip_size_kb": 0, 00:26:27.938 "state": "online", 00:26:27.938 "raid_level": "raid1", 00:26:27.938 "superblock": true, 00:26:27.938 "num_base_bdevs": 4, 00:26:27.938 "num_base_bdevs_discovered": 3, 00:26:27.938 "num_base_bdevs_operational": 3, 00:26:27.938 "process": { 00:26:27.938 "type": "rebuild", 00:26:27.938 "target": "spare", 00:26:27.938 "progress": { 00:26:27.938 "blocks": 28672, 00:26:27.938 "percent": 45 00:26:27.938 } 00:26:27.938 }, 00:26:27.938 "base_bdevs_list": [ 00:26:27.938 { 00:26:27.938 "name": "spare", 00:26:27.938 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:27.938 "is_configured": true, 00:26:27.938 "data_offset": 2048, 00:26:27.938 "data_size": 63488 00:26:27.938 }, 00:26:27.938 { 00:26:27.938 "name": null, 00:26:27.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.938 "is_configured": false, 00:26:27.938 "data_offset": 2048, 00:26:27.938 "data_size": 63488 00:26:27.938 }, 00:26:27.938 { 00:26:27.938 
"name": "BaseBdev3", 00:26:27.938 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:27.938 "is_configured": true, 00:26:27.938 "data_offset": 2048, 00:26:27.938 "data_size": 63488 00:26:27.938 }, 00:26:27.938 { 00:26:27.938 "name": "BaseBdev4", 00:26:27.938 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:27.938 "is_configured": true, 00:26:27.938 "data_offset": 2048, 00:26:27.938 "data_size": 63488 00:26:27.938 } 00:26:27.938 ] 00:26:27.938 }' 00:26:27.938 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:27.938 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:27.938 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:27.938 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:27.938 13:52:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:27.938 [2024-07-12 13:52:16.495193] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:26:28.196 [2024-07-12 13:52:16.719315] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:29.130 13:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:29.130 13:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:29.130 13:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:29.130 13:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:29.130 13:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:29.130 13:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:29.130 13:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.130 13:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.130 [2024-07-12 13:52:17.707862] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:26:29.387 [2024-07-12 13:52:17.910440] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:29.644 13:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:29.644 "name": "raid_bdev1", 00:26:29.644 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:29.644 "strip_size_kb": 0, 00:26:29.644 "state": "online", 00:26:29.644 "raid_level": "raid1", 00:26:29.644 "superblock": true, 00:26:29.644 "num_base_bdevs": 4, 00:26:29.644 "num_base_bdevs_discovered": 3, 00:26:29.644 "num_base_bdevs_operational": 3, 00:26:29.644 "process": { 00:26:29.644 "type": "rebuild", 00:26:29.644 "target": "spare", 00:26:29.644 "progress": { 00:26:29.644 "blocks": 53248, 00:26:29.644 "percent": 83 00:26:29.644 } 00:26:29.644 }, 00:26:29.644 "base_bdevs_list": [ 00:26:29.644 { 00:26:29.644 "name": "spare", 00:26:29.644 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:29.645 "is_configured": true, 00:26:29.645 
"data_offset": 2048, 00:26:29.645 "data_size": 63488 00:26:29.645 }, 00:26:29.645 { 00:26:29.645 "name": null, 00:26:29.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.645 "is_configured": false, 00:26:29.645 "data_offset": 2048, 00:26:29.645 "data_size": 63488 00:26:29.645 }, 00:26:29.645 { 00:26:29.645 "name": "BaseBdev3", 00:26:29.645 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:29.645 "is_configured": true, 00:26:29.645 "data_offset": 2048, 00:26:29.645 "data_size": 63488 00:26:29.645 }, 00:26:29.645 { 00:26:29.645 "name": "BaseBdev4", 00:26:29.645 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:29.645 "is_configured": true, 00:26:29.645 "data_offset": 2048, 00:26:29.645 "data_size": 63488 00:26:29.645 } 00:26:29.645 ] 00:26:29.645 }' 00:26:29.645 13:52:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:29.645 13:52:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:29.645 13:52:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:29.645 13:52:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:29.645 13:52:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:29.645 [2024-07-12 13:52:18.142600] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:26:29.645 [2024-07-12 13:52:18.142938] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:26:30.210 [2024-07-12 13:52:18.589458] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:30.210 [2024-07-12 13:52:18.689730] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:30.210 [2024-07-12 13:52:18.692141] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:30.775 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:30.776 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:30.776 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:30.776 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:30.776 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:30.776 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:30.776 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.776 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:30.776 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:30.776 "name": "raid_bdev1", 00:26:30.776 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:30.776 "strip_size_kb": 0, 00:26:30.776 "state": "online", 00:26:30.776 "raid_level": "raid1", 00:26:30.776 "superblock": true, 00:26:30.776 "num_base_bdevs": 4, 00:26:30.776 "num_base_bdevs_discovered": 3, 00:26:30.776 "num_base_bdevs_operational": 3, 00:26:30.776 
"base_bdevs_list": [ 00:26:30.776 { 00:26:30.776 "name": "spare", 00:26:30.776 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:30.776 "is_configured": true, 00:26:30.776 "data_offset": 2048, 00:26:30.776 "data_size": 63488 00:26:30.776 }, 00:26:30.776 { 00:26:30.776 "name": null, 00:26:30.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:30.776 "is_configured": false, 00:26:30.776 "data_offset": 2048, 00:26:30.776 "data_size": 63488 00:26:30.776 }, 00:26:30.776 { 00:26:30.776 "name": "BaseBdev3", 00:26:30.776 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:30.776 "is_configured": true, 00:26:30.776 "data_offset": 2048, 00:26:30.776 "data_size": 63488 00:26:30.776 }, 00:26:30.776 { 00:26:30.776 "name": "BaseBdev4", 00:26:30.776 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:30.776 "is_configured": true, 00:26:30.776 "data_offset": 2048, 00:26:30.776 "data_size": 63488 00:26:30.776 } 00:26:30.776 ] 00:26:30.776 }' 00:26:30.776 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:31.034 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:31.034 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:31.034 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:31.034 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:26:31.034 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:31.034 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:31.034 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:31.034 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:31.034 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:31.034 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.034 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:31.291 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:31.291 "name": "raid_bdev1", 00:26:31.291 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:31.291 "strip_size_kb": 0, 00:26:31.291 "state": "online", 00:26:31.291 "raid_level": "raid1", 00:26:31.291 "superblock": true, 00:26:31.291 "num_base_bdevs": 4, 00:26:31.291 "num_base_bdevs_discovered": 3, 00:26:31.291 "num_base_bdevs_operational": 3, 00:26:31.291 "base_bdevs_list": [ 00:26:31.291 { 00:26:31.291 "name": "spare", 00:26:31.291 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:31.291 "is_configured": true, 00:26:31.291 "data_offset": 2048, 00:26:31.291 "data_size": 63488 00:26:31.291 }, 00:26:31.291 { 00:26:31.291 "name": null, 00:26:31.291 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:31.291 "is_configured": false, 00:26:31.291 "data_offset": 2048, 00:26:31.291 "data_size": 63488 00:26:31.291 }, 00:26:31.291 { 00:26:31.291 "name": "BaseBdev3", 00:26:31.291 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:31.291 "is_configured": true, 00:26:31.291 "data_offset": 2048, 00:26:31.291 "data_size": 
63488 00:26:31.291 }, 00:26:31.291 { 00:26:31.291 "name": "BaseBdev4", 00:26:31.291 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:31.291 "is_configured": true, 00:26:31.291 "data_offset": 2048, 00:26:31.291 "data_size": 63488 00:26:31.291 } 00:26:31.292 ] 00:26:31.292 }' 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:31.292 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:31.549 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.549 13:52:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:32.115 13:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:32.115 "name": "raid_bdev1", 00:26:32.115 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:32.115 "strip_size_kb": 0, 00:26:32.115 "state": "online", 00:26:32.115 "raid_level": "raid1", 00:26:32.115 "superblock": true, 00:26:32.115 "num_base_bdevs": 4, 00:26:32.115 "num_base_bdevs_discovered": 3, 00:26:32.115 "num_base_bdevs_operational": 3, 00:26:32.115 "base_bdevs_list": [ 00:26:32.115 { 00:26:32.115 "name": "spare", 00:26:32.115 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:32.115 "is_configured": true, 00:26:32.115 "data_offset": 2048, 00:26:32.115 "data_size": 63488 00:26:32.115 }, 00:26:32.115 { 00:26:32.115 "name": null, 00:26:32.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:32.115 "is_configured": false, 00:26:32.115 "data_offset": 2048, 00:26:32.115 "data_size": 63488 00:26:32.115 }, 00:26:32.115 { 00:26:32.115 "name": "BaseBdev3", 00:26:32.115 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:32.115 "is_configured": true, 00:26:32.115 "data_offset": 2048, 00:26:32.115 "data_size": 63488 00:26:32.115 }, 00:26:32.115 { 00:26:32.115 "name": "BaseBdev4", 00:26:32.115 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:32.115 "is_configured": true, 00:26:32.115 "data_offset": 2048, 
00:26:32.115 "data_size": 63488 00:26:32.115 } 00:26:32.115 ] 00:26:32.115 }' 00:26:32.115 13:52:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:32.115 13:52:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:33.049 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:33.049 [2024-07-12 13:52:21.493862] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:33.049 [2024-07-12 13:52:21.493896] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:33.049 00:26:33.049 Latency(us) 00:26:33.049 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:33.049 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:33.049 raid_bdev1 : 12.24 88.75 266.26 0.00 0.00 15935.96 293.84 119446.48 00:26:33.049 =================================================================================================================== 00:26:33.049 Total : 88.75 266.26 0.00 0.00 15935.96 293.84 119446.48 00:26:33.049 [2024-07-12 13:52:21.586117] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:33.049 [2024-07-12 13:52:21.586146] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:33.049 [2024-07-12 13:52:21.586237] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:33.049 [2024-07-12 13:52:21.586249] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f41e0 name raid_bdev1, state offline 00:26:33.049 0 00:26:33.049 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:33.049 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:33.317 13:52:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
nbd_start_disk spare /dev/nbd0 00:26:33.600 /dev/nbd0 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:33.600 1+0 records in 00:26:33.600 1+0 records out 00:26:33.600 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273391 s, 15.0 MB/s 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:33.600 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:33.873 /dev/nbd1 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:33.873 1+0 records in 00:26:33.873 1+0 records out 00:26:33.873 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000201385 s, 20.3 MB/s 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:33.873 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:26:34.439 /dev/nbd1 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:34.439 1+0 records in 00:26:34.439 1+0 records out 00:26:34.439 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000249557 s, 16.4 MB/s 00:26:34.439 13:52:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:34.439 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:34.439 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:34.439 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:34.439 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:34.439 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:34.439 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:34.439 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:34.695 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:34.695 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:34.695 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:34.695 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:34.695 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:34.695 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:34.695 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:34.953 
13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:34.953 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:35.212 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:35.212 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:35.212 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:35.212 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:35.212 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:35.212 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:35.212 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:35.212 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:35.212 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:35.212 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:35.471 13:52:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:35.730 [2024-07-12 13:52:24.128653] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:35.730 [2024-07-12 13:52:24.128700] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:35.730 [2024-07-12 13:52:24.128723] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1873970 00:26:35.730 [2024-07-12 13:52:24.128736] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:35.730 [2024-07-12 13:52:24.130379] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:35.730 [2024-07-12 13:52:24.130409] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:35.730 [2024-07-12 13:52:24.130492] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:35.730 [2024-07-12 13:52:24.130525] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:35.730 [2024-07-12 13:52:24.130629] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:35.730 [2024-07-12 13:52:24.130705] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:35.730 spare 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.730 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:35.730 [2024-07-12 13:52:24.231029] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17f56c0 00:26:35.730 [2024-07-12 13:52:24.231051] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:35.730 [2024-07-12 13:52:24.231265] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f4620 00:26:35.730 [2024-07-12 13:52:24.231428] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17f56c0 00:26:35.730 [2024-07-12 13:52:24.231438] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17f56c0 00:26:35.730 [2024-07-12 13:52:24.231553] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:35.989 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:35.989 "name": "raid_bdev1", 00:26:35.989 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:35.989 "strip_size_kb": 0, 00:26:35.989 "state": "online", 00:26:35.989 "raid_level": "raid1", 00:26:35.989 "superblock": true, 00:26:35.989 "num_base_bdevs": 4, 00:26:35.989 "num_base_bdevs_discovered": 3, 00:26:35.989 "num_base_bdevs_operational": 3, 00:26:35.989 "base_bdevs_list": [ 00:26:35.989 { 00:26:35.989 "name": "spare", 00:26:35.989 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:35.989 "is_configured": true, 00:26:35.989 "data_offset": 2048, 00:26:35.989 "data_size": 63488 00:26:35.989 }, 00:26:35.989 { 00:26:35.989 "name": null, 00:26:35.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:35.989 "is_configured": false, 00:26:35.989 "data_offset": 2048, 00:26:35.989 "data_size": 63488 00:26:35.989 }, 00:26:35.989 { 00:26:35.989 "name": "BaseBdev3", 00:26:35.989 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:35.989 "is_configured": true, 00:26:35.989 "data_offset": 2048, 00:26:35.989 "data_size": 63488 00:26:35.989 }, 00:26:35.989 { 00:26:35.989 "name": "BaseBdev4", 00:26:35.989 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:35.989 "is_configured": true, 00:26:35.989 "data_offset": 2048, 00:26:35.989 "data_size": 63488 00:26:35.989 } 00:26:35.990 ] 00:26:35.990 }' 00:26:35.990 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:35.990 13:52:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:36.555 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:36.555 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:36.555 13:52:25 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:36.555 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:36.555 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:36.555 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.555 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:36.814 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:36.814 "name": "raid_bdev1", 00:26:36.814 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:36.814 "strip_size_kb": 0, 00:26:36.814 "state": "online", 00:26:36.814 "raid_level": "raid1", 00:26:36.814 "superblock": true, 00:26:36.814 "num_base_bdevs": 4, 00:26:36.814 "num_base_bdevs_discovered": 3, 00:26:36.814 "num_base_bdevs_operational": 3, 00:26:36.814 "base_bdevs_list": [ 00:26:36.814 { 00:26:36.814 "name": "spare", 00:26:36.814 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:36.814 "is_configured": true, 00:26:36.814 "data_offset": 2048, 00:26:36.814 "data_size": 63488 00:26:36.814 }, 00:26:36.814 { 00:26:36.814 "name": null, 00:26:36.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:36.814 "is_configured": false, 00:26:36.814 "data_offset": 2048, 00:26:36.814 "data_size": 63488 00:26:36.814 }, 00:26:36.814 { 00:26:36.814 "name": "BaseBdev3", 00:26:36.814 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:36.814 "is_configured": true, 00:26:36.814 "data_offset": 2048, 00:26:36.814 "data_size": 63488 00:26:36.814 }, 00:26:36.814 { 00:26:36.814 "name": "BaseBdev4", 00:26:36.814 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:36.814 "is_configured": true, 00:26:36.814 "data_offset": 2048, 00:26:36.814 "data_size": 63488 00:26:36.814 } 00:26:36.814 ] 00:26:36.814 }' 00:26:36.814 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:37.072 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:37.072 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:37.072 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:37.072 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.072 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:37.331 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:37.331 13:52:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:37.591 [2024-07-12 13:52:26.098286] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:37.591 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:37.591 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:37.591 13:52:26 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:37.591 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:37.591 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:37.591 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:37.591 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:37.591 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:37.591 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:37.591 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:37.591 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.591 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.160 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:38.160 "name": "raid_bdev1", 00:26:38.160 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:38.160 "strip_size_kb": 0, 00:26:38.160 "state": "online", 00:26:38.160 "raid_level": "raid1", 00:26:38.160 "superblock": true, 00:26:38.160 "num_base_bdevs": 4, 00:26:38.160 "num_base_bdevs_discovered": 2, 00:26:38.160 "num_base_bdevs_operational": 2, 00:26:38.160 "base_bdevs_list": [ 00:26:38.160 { 00:26:38.160 "name": null, 00:26:38.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.160 "is_configured": false, 00:26:38.160 "data_offset": 2048, 00:26:38.160 "data_size": 63488 00:26:38.160 }, 00:26:38.160 { 00:26:38.160 "name": null, 00:26:38.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.160 "is_configured": false, 00:26:38.160 "data_offset": 2048, 00:26:38.160 "data_size": 63488 00:26:38.160 }, 00:26:38.160 { 00:26:38.160 "name": "BaseBdev3", 00:26:38.160 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:38.160 "is_configured": true, 00:26:38.160 "data_offset": 2048, 00:26:38.160 "data_size": 63488 00:26:38.160 }, 00:26:38.160 { 00:26:38.160 "name": "BaseBdev4", 00:26:38.160 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:38.160 "is_configured": true, 00:26:38.160 "data_offset": 2048, 00:26:38.160 "data_size": 63488 00:26:38.160 } 00:26:38.160 ] 00:26:38.160 }' 00:26:38.160 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:38.160 13:52:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:38.728 13:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:38.987 [2024-07-12 13:52:27.490183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:38.987 [2024-07-12 13:52:27.490335] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:38.987 [2024-07-12 13:52:27.490352] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
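For readers following the trace, the verify_raid_bdev_process check that runs right after the rebuild starts reduces to roughly the shell sketch below. The rpc.py path, the RPC socket, and the jq filters are the ones visible in the trace itself; the variable names and the standalone form of the test are illustrative only, not the exact helper from bdev_raid.sh:

  # Query all raid bdevs over the test's RPC socket and keep the entry for raid_bdev1.
  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  raid_bdev_info=$($rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  # Confirm a rebuild process is active and that its target is the "spare" bdev;
  # ".process.type // \"none\"" falls back to "none" when no background process is running.
  [[ $(echo "$raid_bdev_info" | jq -r '.process.type // "none"') == "rebuild" ]]
  [[ $(echo "$raid_bdev_info" | jq -r '.process.target // "none"') == "spare" ]]
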
00:26:38.987 [2024-07-12 13:52:27.490381] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:38.987 [2024-07-12 13:52:27.494815] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c2690 00:26:38.987 [2024-07-12 13:52:27.497079] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:38.987 13:52:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:40.365 "name": "raid_bdev1", 00:26:40.365 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:40.365 "strip_size_kb": 0, 00:26:40.365 "state": "online", 00:26:40.365 "raid_level": "raid1", 00:26:40.365 "superblock": true, 00:26:40.365 "num_base_bdevs": 4, 00:26:40.365 "num_base_bdevs_discovered": 3, 00:26:40.365 "num_base_bdevs_operational": 3, 00:26:40.365 "process": { 00:26:40.365 "type": "rebuild", 00:26:40.365 "target": "spare", 00:26:40.365 "progress": { 00:26:40.365 "blocks": 24576, 00:26:40.365 "percent": 38 00:26:40.365 } 00:26:40.365 }, 00:26:40.365 "base_bdevs_list": [ 00:26:40.365 { 00:26:40.365 "name": "spare", 00:26:40.365 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:40.365 "is_configured": true, 00:26:40.365 "data_offset": 2048, 00:26:40.365 "data_size": 63488 00:26:40.365 }, 00:26:40.365 { 00:26:40.365 "name": null, 00:26:40.365 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:40.365 "is_configured": false, 00:26:40.365 "data_offset": 2048, 00:26:40.365 "data_size": 63488 00:26:40.365 }, 00:26:40.365 { 00:26:40.365 "name": "BaseBdev3", 00:26:40.365 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:40.365 "is_configured": true, 00:26:40.365 "data_offset": 2048, 00:26:40.365 "data_size": 63488 00:26:40.365 }, 00:26:40.365 { 00:26:40.365 "name": "BaseBdev4", 00:26:40.365 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:40.365 "is_configured": true, 00:26:40.365 "data_offset": 2048, 00:26:40.365 "data_size": 63488 00:26:40.365 } 00:26:40.365 ] 00:26:40.365 }' 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:40.365 13:52:28 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:40.933 [2024-07-12 13:52:29.356633] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:40.933 [2024-07-12 13:52:29.412415] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:40.933 [2024-07-12 13:52:29.412461] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:40.933 [2024-07-12 13:52:29.412477] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:40.933 [2024-07-12 13:52:29.412485] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.933 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:41.501 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:41.501 "name": "raid_bdev1", 00:26:41.501 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:41.501 "strip_size_kb": 0, 00:26:41.501 "state": "online", 00:26:41.501 "raid_level": "raid1", 00:26:41.501 "superblock": true, 00:26:41.501 "num_base_bdevs": 4, 00:26:41.501 "num_base_bdevs_discovered": 2, 00:26:41.501 "num_base_bdevs_operational": 2, 00:26:41.501 "base_bdevs_list": [ 00:26:41.501 { 00:26:41.501 "name": null, 00:26:41.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:41.501 "is_configured": false, 00:26:41.501 "data_offset": 2048, 00:26:41.501 "data_size": 63488 00:26:41.501 }, 00:26:41.501 { 00:26:41.501 "name": null, 00:26:41.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:41.501 "is_configured": false, 00:26:41.501 "data_offset": 2048, 00:26:41.501 "data_size": 63488 00:26:41.501 }, 00:26:41.501 { 00:26:41.501 "name": "BaseBdev3", 00:26:41.501 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:41.501 "is_configured": true, 00:26:41.501 "data_offset": 2048, 00:26:41.501 "data_size": 63488 00:26:41.501 }, 00:26:41.501 { 00:26:41.501 "name": "BaseBdev4", 00:26:41.501 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:41.501 "is_configured": true, 
00:26:41.501 "data_offset": 2048, 00:26:41.501 "data_size": 63488 00:26:41.501 } 00:26:41.501 ] 00:26:41.501 }' 00:26:41.501 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:41.501 13:52:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:42.069 13:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:42.328 [2024-07-12 13:52:30.728751] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:42.328 [2024-07-12 13:52:30.728803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:42.328 [2024-07-12 13:52:30.728826] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x186dab0 00:26:42.328 [2024-07-12 13:52:30.728844] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:42.328 [2024-07-12 13:52:30.729229] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:42.328 [2024-07-12 13:52:30.729250] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:42.328 [2024-07-12 13:52:30.729334] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:42.328 [2024-07-12 13:52:30.729346] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:42.328 [2024-07-12 13:52:30.729357] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:42.328 [2024-07-12 13:52:30.729376] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:42.328 [2024-07-12 13:52:30.733795] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f4910 00:26:42.328 spare 00:26:42.328 [2024-07-12 13:52:30.735290] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:42.328 13:52:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:43.264 13:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:43.264 13:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:43.264 13:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:43.264 13:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:43.264 13:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:43.264 13:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.264 13:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:43.523 13:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:43.523 "name": "raid_bdev1", 00:26:43.523 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:43.523 "strip_size_kb": 0, 00:26:43.523 "state": "online", 00:26:43.523 "raid_level": "raid1", 00:26:43.523 "superblock": true, 00:26:43.523 "num_base_bdevs": 4, 00:26:43.523 "num_base_bdevs_discovered": 3, 00:26:43.523 
"num_base_bdevs_operational": 3, 00:26:43.523 "process": { 00:26:43.523 "type": "rebuild", 00:26:43.523 "target": "spare", 00:26:43.523 "progress": { 00:26:43.523 "blocks": 22528, 00:26:43.523 "percent": 35 00:26:43.523 } 00:26:43.523 }, 00:26:43.523 "base_bdevs_list": [ 00:26:43.523 { 00:26:43.523 "name": "spare", 00:26:43.523 "uuid": "286153a7-ad18-5d26-abc3-03098af851cf", 00:26:43.523 "is_configured": true, 00:26:43.523 "data_offset": 2048, 00:26:43.523 "data_size": 63488 00:26:43.523 }, 00:26:43.523 { 00:26:43.523 "name": null, 00:26:43.523 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:43.523 "is_configured": false, 00:26:43.523 "data_offset": 2048, 00:26:43.523 "data_size": 63488 00:26:43.523 }, 00:26:43.523 { 00:26:43.523 "name": "BaseBdev3", 00:26:43.523 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:43.523 "is_configured": true, 00:26:43.523 "data_offset": 2048, 00:26:43.523 "data_size": 63488 00:26:43.523 }, 00:26:43.523 { 00:26:43.523 "name": "BaseBdev4", 00:26:43.523 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:43.523 "is_configured": true, 00:26:43.523 "data_offset": 2048, 00:26:43.523 "data_size": 63488 00:26:43.523 } 00:26:43.523 ] 00:26:43.523 }' 00:26:43.523 13:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:43.523 13:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:43.523 13:52:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:43.523 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:43.523 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:43.782 [2024-07-12 13:52:32.250395] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:43.782 [2024-07-12 13:52:32.347911] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:43.782 [2024-07-12 13:52:32.347973] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:43.782 [2024-07-12 13:52:32.347990] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:43.782 [2024-07-12 13:52:32.347999] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:44.041 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:44.041 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:44.041 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:44.041 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:44.041 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:44.041 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:44.041 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:44.041 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:44.041 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:44.041 
13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:44.041 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.041 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:44.300 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:44.300 "name": "raid_bdev1", 00:26:44.300 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:44.300 "strip_size_kb": 0, 00:26:44.300 "state": "online", 00:26:44.300 "raid_level": "raid1", 00:26:44.300 "superblock": true, 00:26:44.300 "num_base_bdevs": 4, 00:26:44.300 "num_base_bdevs_discovered": 2, 00:26:44.300 "num_base_bdevs_operational": 2, 00:26:44.300 "base_bdevs_list": [ 00:26:44.300 { 00:26:44.300 "name": null, 00:26:44.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:44.300 "is_configured": false, 00:26:44.300 "data_offset": 2048, 00:26:44.300 "data_size": 63488 00:26:44.300 }, 00:26:44.300 { 00:26:44.300 "name": null, 00:26:44.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:44.300 "is_configured": false, 00:26:44.300 "data_offset": 2048, 00:26:44.300 "data_size": 63488 00:26:44.300 }, 00:26:44.300 { 00:26:44.300 "name": "BaseBdev3", 00:26:44.300 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:44.300 "is_configured": true, 00:26:44.300 "data_offset": 2048, 00:26:44.300 "data_size": 63488 00:26:44.300 }, 00:26:44.300 { 00:26:44.300 "name": "BaseBdev4", 00:26:44.300 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:44.300 "is_configured": true, 00:26:44.300 "data_offset": 2048, 00:26:44.300 "data_size": 63488 00:26:44.300 } 00:26:44.300 ] 00:26:44.300 }' 00:26:44.301 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:44.301 13:52:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:44.867 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:44.867 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:44.867 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:44.867 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:44.867 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:44.867 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.867 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.126 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:45.126 "name": "raid_bdev1", 00:26:45.126 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:45.126 "strip_size_kb": 0, 00:26:45.126 "state": "online", 00:26:45.126 "raid_level": "raid1", 00:26:45.126 "superblock": true, 00:26:45.126 "num_base_bdevs": 4, 00:26:45.126 "num_base_bdevs_discovered": 2, 00:26:45.126 "num_base_bdevs_operational": 2, 00:26:45.126 "base_bdevs_list": [ 00:26:45.126 { 00:26:45.126 "name": null, 00:26:45.126 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.126 
"is_configured": false, 00:26:45.126 "data_offset": 2048, 00:26:45.126 "data_size": 63488 00:26:45.126 }, 00:26:45.126 { 00:26:45.126 "name": null, 00:26:45.126 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.126 "is_configured": false, 00:26:45.126 "data_offset": 2048, 00:26:45.126 "data_size": 63488 00:26:45.126 }, 00:26:45.126 { 00:26:45.126 "name": "BaseBdev3", 00:26:45.126 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:45.126 "is_configured": true, 00:26:45.126 "data_offset": 2048, 00:26:45.126 "data_size": 63488 00:26:45.126 }, 00:26:45.126 { 00:26:45.126 "name": "BaseBdev4", 00:26:45.126 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:45.126 "is_configured": true, 00:26:45.126 "data_offset": 2048, 00:26:45.126 "data_size": 63488 00:26:45.126 } 00:26:45.126 ] 00:26:45.126 }' 00:26:45.126 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:45.126 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:45.126 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:45.126 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:45.126 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:45.385 13:52:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:45.643 [2024-07-12 13:52:34.113389] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:45.643 [2024-07-12 13:52:34.113436] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:45.643 [2024-07-12 13:52:34.113455] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c2830 00:26:45.643 [2024-07-12 13:52:34.113468] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:45.643 [2024-07-12 13:52:34.113806] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:45.643 [2024-07-12 13:52:34.113825] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:45.643 [2024-07-12 13:52:34.113893] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:45.644 [2024-07-12 13:52:34.113906] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:45.644 [2024-07-12 13:52:34.113916] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:45.644 BaseBdev1 00:26:45.644 13:52:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.581 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.841 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:46.841 "name": "raid_bdev1", 00:26:46.841 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:46.841 "strip_size_kb": 0, 00:26:46.841 "state": "online", 00:26:46.841 "raid_level": "raid1", 00:26:46.841 "superblock": true, 00:26:46.841 "num_base_bdevs": 4, 00:26:46.841 "num_base_bdevs_discovered": 2, 00:26:46.841 "num_base_bdevs_operational": 2, 00:26:46.841 "base_bdevs_list": [ 00:26:46.841 { 00:26:46.841 "name": null, 00:26:46.841 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:46.841 "is_configured": false, 00:26:46.841 "data_offset": 2048, 00:26:46.841 "data_size": 63488 00:26:46.841 }, 00:26:46.841 { 00:26:46.841 "name": null, 00:26:46.841 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:46.841 "is_configured": false, 00:26:46.841 "data_offset": 2048, 00:26:46.841 "data_size": 63488 00:26:46.841 }, 00:26:46.841 { 00:26:46.841 "name": "BaseBdev3", 00:26:46.841 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:46.841 "is_configured": true, 00:26:46.841 "data_offset": 2048, 00:26:46.841 "data_size": 63488 00:26:46.841 }, 00:26:46.841 { 00:26:46.841 "name": "BaseBdev4", 00:26:46.841 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:46.841 "is_configured": true, 00:26:46.841 "data_offset": 2048, 00:26:46.841 "data_size": 63488 00:26:46.841 } 00:26:46.841 ] 00:26:46.841 }' 00:26:46.841 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:46.841 13:52:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:26:47.779 "name": "raid_bdev1", 00:26:47.779 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:47.779 "strip_size_kb": 0, 00:26:47.779 "state": "online", 00:26:47.779 "raid_level": "raid1", 00:26:47.779 "superblock": true, 00:26:47.779 "num_base_bdevs": 4, 00:26:47.779 "num_base_bdevs_discovered": 2, 00:26:47.779 "num_base_bdevs_operational": 2, 00:26:47.779 "base_bdevs_list": [ 00:26:47.779 { 00:26:47.779 "name": null, 00:26:47.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:47.779 "is_configured": false, 00:26:47.779 "data_offset": 2048, 00:26:47.779 "data_size": 63488 00:26:47.779 }, 00:26:47.779 { 00:26:47.779 "name": null, 00:26:47.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:47.779 "is_configured": false, 00:26:47.779 "data_offset": 2048, 00:26:47.779 "data_size": 63488 00:26:47.779 }, 00:26:47.779 { 00:26:47.779 "name": "BaseBdev3", 00:26:47.779 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:47.779 "is_configured": true, 00:26:47.779 "data_offset": 2048, 00:26:47.779 "data_size": 63488 00:26:47.779 }, 00:26:47.779 { 00:26:47.779 "name": "BaseBdev4", 00:26:47.779 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:47.779 "is_configured": true, 00:26:47.779 "data_offset": 2048, 00:26:47.779 "data_size": 63488 00:26:47.779 } 00:26:47.779 ] 00:26:47.779 }' 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:47.779 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:48.039 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:48.039 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:48.039 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:48.039 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:48.039 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:48.039 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:48.039 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:48.039 
13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:48.039 [2024-07-12 13:52:36.592303] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:48.039 [2024-07-12 13:52:36.592425] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:48.039 [2024-07-12 13:52:36.592440] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:48.039 request: 00:26:48.039 { 00:26:48.039 "base_bdev": "BaseBdev1", 00:26:48.039 "raid_bdev": "raid_bdev1", 00:26:48.039 "method": "bdev_raid_add_base_bdev", 00:26:48.039 "req_id": 1 00:26:48.039 } 00:26:48.039 Got JSON-RPC error response 00:26:48.039 response: 00:26:48.039 { 00:26:48.039 "code": -22, 00:26:48.039 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:48.039 } 00:26:48.297 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:26:48.297 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:48.297 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:48.297 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:48.297 13:52:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.233 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:49.493 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:49.493 "name": "raid_bdev1", 00:26:49.493 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:49.493 "strip_size_kb": 0, 00:26:49.493 "state": "online", 00:26:49.493 "raid_level": "raid1", 00:26:49.493 "superblock": true, 00:26:49.493 "num_base_bdevs": 4, 00:26:49.493 "num_base_bdevs_discovered": 2, 00:26:49.493 "num_base_bdevs_operational": 2, 00:26:49.493 "base_bdevs_list": [ 
00:26:49.493 { 00:26:49.493 "name": null, 00:26:49.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:49.493 "is_configured": false, 00:26:49.493 "data_offset": 2048, 00:26:49.493 "data_size": 63488 00:26:49.493 }, 00:26:49.493 { 00:26:49.493 "name": null, 00:26:49.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:49.493 "is_configured": false, 00:26:49.493 "data_offset": 2048, 00:26:49.493 "data_size": 63488 00:26:49.493 }, 00:26:49.493 { 00:26:49.493 "name": "BaseBdev3", 00:26:49.493 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:49.493 "is_configured": true, 00:26:49.493 "data_offset": 2048, 00:26:49.493 "data_size": 63488 00:26:49.493 }, 00:26:49.493 { 00:26:49.493 "name": "BaseBdev4", 00:26:49.493 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:49.493 "is_configured": true, 00:26:49.493 "data_offset": 2048, 00:26:49.493 "data_size": 63488 00:26:49.493 } 00:26:49.493 ] 00:26:49.493 }' 00:26:49.493 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:49.493 13:52:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:50.062 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:50.062 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:50.062 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:50.062 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:50.062 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:50.062 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.062 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.322 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:50.322 "name": "raid_bdev1", 00:26:50.322 "uuid": "23921a1f-7c09-436b-8de3-4df0b12b742d", 00:26:50.322 "strip_size_kb": 0, 00:26:50.322 "state": "online", 00:26:50.322 "raid_level": "raid1", 00:26:50.322 "superblock": true, 00:26:50.322 "num_base_bdevs": 4, 00:26:50.322 "num_base_bdevs_discovered": 2, 00:26:50.322 "num_base_bdevs_operational": 2, 00:26:50.322 "base_bdevs_list": [ 00:26:50.322 { 00:26:50.322 "name": null, 00:26:50.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.322 "is_configured": false, 00:26:50.322 "data_offset": 2048, 00:26:50.322 "data_size": 63488 00:26:50.322 }, 00:26:50.322 { 00:26:50.322 "name": null, 00:26:50.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.322 "is_configured": false, 00:26:50.322 "data_offset": 2048, 00:26:50.322 "data_size": 63488 00:26:50.322 }, 00:26:50.322 { 00:26:50.322 "name": "BaseBdev3", 00:26:50.322 "uuid": "6c1388d2-2067-5582-befb-c00ae2e780ac", 00:26:50.322 "is_configured": true, 00:26:50.322 "data_offset": 2048, 00:26:50.322 "data_size": 63488 00:26:50.322 }, 00:26:50.322 { 00:26:50.322 "name": "BaseBdev4", 00:26:50.322 "uuid": "8bfefd85-880d-5d96-87ce-f26436229386", 00:26:50.322 "is_configured": true, 00:26:50.322 "data_offset": 2048, 00:26:50.322 "data_size": 63488 00:26:50.322 } 00:26:50.322 ] 00:26:50.322 }' 00:26:50.322 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:26:50.322 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:50.322 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:50.322 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:50.322 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 566610 00:26:50.322 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 566610 ']' 00:26:50.322 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 566610 00:26:50.322 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:26:50.322 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:50.322 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 566610 00:26:50.581 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:50.581 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:50.581 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 566610' 00:26:50.581 killing process with pid 566610 00:26:50.581 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 566610 00:26:50.581 Received shutdown signal, test time was about 29.533328 seconds 00:26:50.581 00:26:50.581 Latency(us) 00:26:50.581 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:50.581 =================================================================================================================== 00:26:50.581 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:50.581 [2024-07-12 13:52:38.921557] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:50.581 [2024-07-12 13:52:38.921665] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:50.581 13:52:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 566610 00:26:50.581 [2024-07-12 13:52:38.921730] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:50.581 [2024-07-12 13:52:38.921745] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f56c0 name raid_bdev1, state offline 00:26:50.581 [2024-07-12 13:52:38.968436] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:50.840 13:52:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:50.840 00:26:50.840 real 0m35.287s 00:26:50.840 user 0m56.442s 00:26:50.840 sys 0m5.401s 00:26:50.840 13:52:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:50.840 13:52:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:50.840 ************************************ 00:26:50.840 END TEST raid_rebuild_test_sb_io 00:26:50.840 ************************************ 00:26:50.840 13:52:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:50.840 13:52:39 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:26:50.840 13:52:39 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:26:50.840 13:52:39 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k 
raid_state_function_test raid1 2 true 00:26:50.840 13:52:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:50.840 13:52:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:50.840 13:52:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:50.840 ************************************ 00:26:50.840 START TEST raid_state_function_test_sb_4k 00:26:50.840 ************************************ 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=571622 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 571622' 00:26:50.840 Process raid pid: 571622 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 571622 /var/tmp/spdk-raid.sock 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 571622 ']' 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:50.840 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:50.840 13:52:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:50.840 [2024-07-12 13:52:39.364097] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:26:50.840 [2024-07-12 13:52:39.364169] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:51.099 [2024-07-12 13:52:39.493469] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:51.099 [2024-07-12 13:52:39.595993] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:51.099 [2024-07-12 13:52:39.661139] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:51.099 [2024-07-12 13:52:39.661169] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:52.036 [2024-07-12 13:52:40.528550] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:52.036 [2024-07-12 13:52:40.528597] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:52.036 [2024-07-12 13:52:40.528608] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:52.036 [2024-07-12 13:52:40.528620] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.036 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:52.295 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:52.295 "name": "Existed_Raid", 00:26:52.295 "uuid": "c84ceea9-489d-458e-b6d1-71b00f9f6c05", 00:26:52.295 "strip_size_kb": 0, 00:26:52.295 "state": "configuring", 00:26:52.295 "raid_level": "raid1", 00:26:52.295 "superblock": true, 00:26:52.295 "num_base_bdevs": 2, 00:26:52.295 "num_base_bdevs_discovered": 0, 00:26:52.295 "num_base_bdevs_operational": 2, 00:26:52.295 "base_bdevs_list": [ 00:26:52.295 { 00:26:52.295 "name": "BaseBdev1", 00:26:52.295 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:52.295 "is_configured": false, 00:26:52.295 "data_offset": 0, 00:26:52.295 "data_size": 0 00:26:52.295 }, 00:26:52.295 { 00:26:52.295 "name": "BaseBdev2", 00:26:52.295 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:52.295 "is_configured": false, 00:26:52.295 "data_offset": 0, 00:26:52.295 "data_size": 0 00:26:52.295 } 00:26:52.295 ] 00:26:52.295 }' 00:26:52.295 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:52.295 13:52:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:52.864 13:52:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:53.123 [2024-07-12 13:52:41.587205] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:53.123 [2024-07-12 13:52:41.587237] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a53330 name Existed_Raid, state configuring 00:26:53.123 13:52:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:53.383 [2024-07-12 13:52:41.847909] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:53.383 [2024-07-12 13:52:41.847943] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:53.383 [2024-07-12 13:52:41.847953] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:53.383 [2024-07-12 13:52:41.847964] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:53.383 13:52:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:26:53.643 [2024-07-12 13:52:42.114503] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:53.643 BaseBdev1 00:26:53.643 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:53.643 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:26:53.643 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:53.643 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:53.643 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:53.643 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:53.643 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:53.903 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:54.163 [ 00:26:54.163 { 00:26:54.163 "name": "BaseBdev1", 00:26:54.163 "aliases": [ 00:26:54.163 "0d3c9922-696b-4d28-a1fd-0e2c44b7f475" 00:26:54.163 ], 00:26:54.163 "product_name": "Malloc disk", 00:26:54.163 "block_size": 4096, 00:26:54.163 "num_blocks": 8192, 00:26:54.163 "uuid": "0d3c9922-696b-4d28-a1fd-0e2c44b7f475", 00:26:54.163 "assigned_rate_limits": { 00:26:54.163 "rw_ios_per_sec": 0, 00:26:54.163 "rw_mbytes_per_sec": 0, 00:26:54.163 "r_mbytes_per_sec": 0, 00:26:54.163 "w_mbytes_per_sec": 0 00:26:54.163 }, 00:26:54.163 "claimed": true, 00:26:54.163 "claim_type": "exclusive_write", 00:26:54.163 "zoned": false, 00:26:54.163 "supported_io_types": { 00:26:54.163 "read": true, 00:26:54.163 "write": true, 00:26:54.163 "unmap": true, 00:26:54.163 "flush": true, 00:26:54.163 "reset": true, 00:26:54.163 "nvme_admin": false, 00:26:54.163 "nvme_io": false, 00:26:54.163 "nvme_io_md": false, 00:26:54.163 "write_zeroes": true, 00:26:54.163 "zcopy": true, 00:26:54.163 "get_zone_info": false, 00:26:54.163 "zone_management": false, 00:26:54.163 "zone_append": false, 00:26:54.163 "compare": false, 00:26:54.163 "compare_and_write": false, 00:26:54.163 "abort": true, 00:26:54.163 "seek_hole": false, 00:26:54.163 "seek_data": false, 00:26:54.163 "copy": true, 00:26:54.163 "nvme_iov_md": false 00:26:54.163 }, 00:26:54.163 "memory_domains": [ 00:26:54.163 { 00:26:54.163 "dma_device_id": "system", 00:26:54.163 "dma_device_type": 1 00:26:54.163 }, 00:26:54.163 { 00:26:54.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:54.163 "dma_device_type": 2 00:26:54.163 } 00:26:54.163 ], 00:26:54.163 "driver_specific": {} 00:26:54.163 } 00:26:54.163 ] 00:26:54.163 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:54.163 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:54.163 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:54.163 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:54.164 
13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:54.164 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:54.164 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:54.164 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:54.164 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:54.164 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:54.164 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:54.164 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.164 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:54.423 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:54.423 "name": "Existed_Raid", 00:26:54.423 "uuid": "2799638b-f813-405a-ad5e-90cd715ba1cb", 00:26:54.423 "strip_size_kb": 0, 00:26:54.423 "state": "configuring", 00:26:54.423 "raid_level": "raid1", 00:26:54.423 "superblock": true, 00:26:54.423 "num_base_bdevs": 2, 00:26:54.423 "num_base_bdevs_discovered": 1, 00:26:54.423 "num_base_bdevs_operational": 2, 00:26:54.423 "base_bdevs_list": [ 00:26:54.424 { 00:26:54.424 "name": "BaseBdev1", 00:26:54.424 "uuid": "0d3c9922-696b-4d28-a1fd-0e2c44b7f475", 00:26:54.424 "is_configured": true, 00:26:54.424 "data_offset": 256, 00:26:54.424 "data_size": 7936 00:26:54.424 }, 00:26:54.424 { 00:26:54.424 "name": "BaseBdev2", 00:26:54.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:54.424 "is_configured": false, 00:26:54.424 "data_offset": 0, 00:26:54.424 "data_size": 0 00:26:54.424 } 00:26:54.424 ] 00:26:54.424 }' 00:26:54.424 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:54.424 13:52:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:54.990 13:52:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:55.248 [2024-07-12 13:52:43.722885] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:55.248 [2024-07-12 13:52:43.722932] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a52c20 name Existed_Raid, state configuring 00:26:55.248 13:52:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:55.507 [2024-07-12 13:52:43.983616] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:55.507 [2024-07-12 13:52:43.985108] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:55.507 [2024-07-12 13:52:43.985144] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.507 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:55.765 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:55.765 "name": "Existed_Raid", 00:26:55.765 "uuid": "ec4305a7-a9a2-4af6-bbd2-16cf68d546fd", 00:26:55.765 "strip_size_kb": 0, 00:26:55.765 "state": "configuring", 00:26:55.765 "raid_level": "raid1", 00:26:55.765 "superblock": true, 00:26:55.765 "num_base_bdevs": 2, 00:26:55.765 "num_base_bdevs_discovered": 1, 00:26:55.765 "num_base_bdevs_operational": 2, 00:26:55.765 "base_bdevs_list": [ 00:26:55.765 { 00:26:55.765 "name": "BaseBdev1", 00:26:55.765 "uuid": "0d3c9922-696b-4d28-a1fd-0e2c44b7f475", 00:26:55.765 "is_configured": true, 00:26:55.765 "data_offset": 256, 00:26:55.765 "data_size": 7936 00:26:55.765 }, 00:26:55.765 { 00:26:55.765 "name": "BaseBdev2", 00:26:55.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:55.765 "is_configured": false, 00:26:55.765 "data_offset": 0, 00:26:55.765 "data_size": 0 00:26:55.765 } 00:26:55.765 ] 00:26:55.765 }' 00:26:55.765 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:55.765 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:56.334 13:52:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:26:56.593 [2024-07-12 13:52:45.113981] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:56.593 [2024-07-12 13:52:45.114137] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a53a10 00:26:56.593 [2024-07-12 13:52:45.114151] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:56.593 [2024-07-12 13:52:45.114329] 
bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a52b70 00:26:56.593 [2024-07-12 13:52:45.114460] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a53a10 00:26:56.593 [2024-07-12 13:52:45.114470] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a53a10 00:26:56.593 [2024-07-12 13:52:45.114563] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:56.593 BaseBdev2 00:26:56.593 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:56.593 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:26:56.593 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:56.593 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:56.593 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:56.593 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:56.593 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:56.853 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:57.112 [ 00:26:57.112 { 00:26:57.112 "name": "BaseBdev2", 00:26:57.112 "aliases": [ 00:26:57.112 "9c056747-dbab-4023-b680-05a3df307b85" 00:26:57.112 ], 00:26:57.112 "product_name": "Malloc disk", 00:26:57.112 "block_size": 4096, 00:26:57.112 "num_blocks": 8192, 00:26:57.112 "uuid": "9c056747-dbab-4023-b680-05a3df307b85", 00:26:57.112 "assigned_rate_limits": { 00:26:57.112 "rw_ios_per_sec": 0, 00:26:57.112 "rw_mbytes_per_sec": 0, 00:26:57.112 "r_mbytes_per_sec": 0, 00:26:57.112 "w_mbytes_per_sec": 0 00:26:57.112 }, 00:26:57.112 "claimed": true, 00:26:57.112 "claim_type": "exclusive_write", 00:26:57.112 "zoned": false, 00:26:57.112 "supported_io_types": { 00:26:57.112 "read": true, 00:26:57.112 "write": true, 00:26:57.112 "unmap": true, 00:26:57.112 "flush": true, 00:26:57.112 "reset": true, 00:26:57.112 "nvme_admin": false, 00:26:57.112 "nvme_io": false, 00:26:57.112 "nvme_io_md": false, 00:26:57.112 "write_zeroes": true, 00:26:57.112 "zcopy": true, 00:26:57.112 "get_zone_info": false, 00:26:57.112 "zone_management": false, 00:26:57.112 "zone_append": false, 00:26:57.112 "compare": false, 00:26:57.112 "compare_and_write": false, 00:26:57.112 "abort": true, 00:26:57.112 "seek_hole": false, 00:26:57.112 "seek_data": false, 00:26:57.112 "copy": true, 00:26:57.112 "nvme_iov_md": false 00:26:57.112 }, 00:26:57.112 "memory_domains": [ 00:26:57.112 { 00:26:57.112 "dma_device_id": "system", 00:26:57.112 "dma_device_type": 1 00:26:57.112 }, 00:26:57.112 { 00:26:57.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:57.112 "dma_device_type": 2 00:26:57.112 } 00:26:57.112 ], 00:26:57.112 "driver_specific": {} 00:26:57.112 } 00:26:57.112 ] 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:26:57.112 13:52:45 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.112 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:57.371 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:57.371 "name": "Existed_Raid", 00:26:57.371 "uuid": "ec4305a7-a9a2-4af6-bbd2-16cf68d546fd", 00:26:57.371 "strip_size_kb": 0, 00:26:57.371 "state": "online", 00:26:57.371 "raid_level": "raid1", 00:26:57.371 "superblock": true, 00:26:57.371 "num_base_bdevs": 2, 00:26:57.371 "num_base_bdevs_discovered": 2, 00:26:57.371 "num_base_bdevs_operational": 2, 00:26:57.371 "base_bdevs_list": [ 00:26:57.371 { 00:26:57.371 "name": "BaseBdev1", 00:26:57.371 "uuid": "0d3c9922-696b-4d28-a1fd-0e2c44b7f475", 00:26:57.371 "is_configured": true, 00:26:57.371 "data_offset": 256, 00:26:57.371 "data_size": 7936 00:26:57.371 }, 00:26:57.371 { 00:26:57.371 "name": "BaseBdev2", 00:26:57.371 "uuid": "9c056747-dbab-4023-b680-05a3df307b85", 00:26:57.371 "is_configured": true, 00:26:57.371 "data_offset": 256, 00:26:57.371 "data_size": 7936 00:26:57.371 } 00:26:57.371 ] 00:26:57.371 }' 00:26:57.371 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:57.371 13:52:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:57.938 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:57.938 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:57.938 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:57.938 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:57.938 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:57.938 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 
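The state and property checks traced here reduce to a handful of rpc.py calls against the test socket. Below is a minimal standalone sketch of that sequence, assuming a bdev_svc target is already listening on /var/tmp/spdk-raid.sock; it reuses only the RPCs and jq filters visible in this trace, and the RPC shell variable is an illustrative shorthand rather than part of the harness.

# Illustrative shorthand for the RPC client invoked throughout this run.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Two 32 MiB malloc base bdevs with a 4096-byte block size, as created above.
$RPC bdev_malloc_create 32 4096 -b BaseBdev1
$RPC bdev_malloc_create 32 4096 -b BaseBdev2

# RAID1 volume with an on-disk superblock (-s), mirroring the bdev_raid_create call in the trace.
$RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

# State check: the raid bdev is expected to report "online" with 2 of 2 base bdevs discovered.
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'

# Property check mirroring verify_raid_bdev_properties: block_size should be 4096, while
# md_size, md_interleave and dif_type are expected to come back null for these malloc bdevs.
$RPC bdev_get_bdevs -b Existed_Raid | jq '.[] | {block_size, md_size, md_interleave, dif_type}'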
00:26:57.938 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:57.938 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:58.196 [2024-07-12 13:52:46.706459] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:58.196 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:58.196 "name": "Existed_Raid", 00:26:58.196 "aliases": [ 00:26:58.196 "ec4305a7-a9a2-4af6-bbd2-16cf68d546fd" 00:26:58.196 ], 00:26:58.196 "product_name": "Raid Volume", 00:26:58.196 "block_size": 4096, 00:26:58.196 "num_blocks": 7936, 00:26:58.196 "uuid": "ec4305a7-a9a2-4af6-bbd2-16cf68d546fd", 00:26:58.196 "assigned_rate_limits": { 00:26:58.196 "rw_ios_per_sec": 0, 00:26:58.196 "rw_mbytes_per_sec": 0, 00:26:58.196 "r_mbytes_per_sec": 0, 00:26:58.196 "w_mbytes_per_sec": 0 00:26:58.196 }, 00:26:58.196 "claimed": false, 00:26:58.196 "zoned": false, 00:26:58.196 "supported_io_types": { 00:26:58.196 "read": true, 00:26:58.196 "write": true, 00:26:58.196 "unmap": false, 00:26:58.196 "flush": false, 00:26:58.196 "reset": true, 00:26:58.196 "nvme_admin": false, 00:26:58.196 "nvme_io": false, 00:26:58.196 "nvme_io_md": false, 00:26:58.196 "write_zeroes": true, 00:26:58.196 "zcopy": false, 00:26:58.196 "get_zone_info": false, 00:26:58.196 "zone_management": false, 00:26:58.196 "zone_append": false, 00:26:58.196 "compare": false, 00:26:58.196 "compare_and_write": false, 00:26:58.196 "abort": false, 00:26:58.196 "seek_hole": false, 00:26:58.196 "seek_data": false, 00:26:58.196 "copy": false, 00:26:58.196 "nvme_iov_md": false 00:26:58.196 }, 00:26:58.196 "memory_domains": [ 00:26:58.196 { 00:26:58.196 "dma_device_id": "system", 00:26:58.196 "dma_device_type": 1 00:26:58.197 }, 00:26:58.197 { 00:26:58.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:58.197 "dma_device_type": 2 00:26:58.197 }, 00:26:58.197 { 00:26:58.197 "dma_device_id": "system", 00:26:58.197 "dma_device_type": 1 00:26:58.197 }, 00:26:58.197 { 00:26:58.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:58.197 "dma_device_type": 2 00:26:58.197 } 00:26:58.197 ], 00:26:58.197 "driver_specific": { 00:26:58.197 "raid": { 00:26:58.197 "uuid": "ec4305a7-a9a2-4af6-bbd2-16cf68d546fd", 00:26:58.197 "strip_size_kb": 0, 00:26:58.197 "state": "online", 00:26:58.197 "raid_level": "raid1", 00:26:58.197 "superblock": true, 00:26:58.197 "num_base_bdevs": 2, 00:26:58.197 "num_base_bdevs_discovered": 2, 00:26:58.197 "num_base_bdevs_operational": 2, 00:26:58.197 "base_bdevs_list": [ 00:26:58.197 { 00:26:58.197 "name": "BaseBdev1", 00:26:58.197 "uuid": "0d3c9922-696b-4d28-a1fd-0e2c44b7f475", 00:26:58.197 "is_configured": true, 00:26:58.197 "data_offset": 256, 00:26:58.197 "data_size": 7936 00:26:58.197 }, 00:26:58.197 { 00:26:58.197 "name": "BaseBdev2", 00:26:58.197 "uuid": "9c056747-dbab-4023-b680-05a3df307b85", 00:26:58.197 "is_configured": true, 00:26:58.197 "data_offset": 256, 00:26:58.197 "data_size": 7936 00:26:58.197 } 00:26:58.197 ] 00:26:58.197 } 00:26:58.197 } 00:26:58.197 }' 00:26:58.197 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:58.455 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:58.455 BaseBdev2' 
00:26:58.455 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:58.455 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:58.455 13:52:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:58.713 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:58.713 "name": "BaseBdev1", 00:26:58.713 "aliases": [ 00:26:58.713 "0d3c9922-696b-4d28-a1fd-0e2c44b7f475" 00:26:58.713 ], 00:26:58.713 "product_name": "Malloc disk", 00:26:58.714 "block_size": 4096, 00:26:58.714 "num_blocks": 8192, 00:26:58.714 "uuid": "0d3c9922-696b-4d28-a1fd-0e2c44b7f475", 00:26:58.714 "assigned_rate_limits": { 00:26:58.714 "rw_ios_per_sec": 0, 00:26:58.714 "rw_mbytes_per_sec": 0, 00:26:58.714 "r_mbytes_per_sec": 0, 00:26:58.714 "w_mbytes_per_sec": 0 00:26:58.714 }, 00:26:58.714 "claimed": true, 00:26:58.714 "claim_type": "exclusive_write", 00:26:58.714 "zoned": false, 00:26:58.714 "supported_io_types": { 00:26:58.714 "read": true, 00:26:58.714 "write": true, 00:26:58.714 "unmap": true, 00:26:58.714 "flush": true, 00:26:58.714 "reset": true, 00:26:58.714 "nvme_admin": false, 00:26:58.714 "nvme_io": false, 00:26:58.714 "nvme_io_md": false, 00:26:58.714 "write_zeroes": true, 00:26:58.714 "zcopy": true, 00:26:58.714 "get_zone_info": false, 00:26:58.714 "zone_management": false, 00:26:58.714 "zone_append": false, 00:26:58.714 "compare": false, 00:26:58.714 "compare_and_write": false, 00:26:58.714 "abort": true, 00:26:58.714 "seek_hole": false, 00:26:58.714 "seek_data": false, 00:26:58.714 "copy": true, 00:26:58.714 "nvme_iov_md": false 00:26:58.714 }, 00:26:58.714 "memory_domains": [ 00:26:58.714 { 00:26:58.714 "dma_device_id": "system", 00:26:58.714 "dma_device_type": 1 00:26:58.714 }, 00:26:58.714 { 00:26:58.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:58.714 "dma_device_type": 2 00:26:58.714 } 00:26:58.714 ], 00:26:58.714 "driver_specific": {} 00:26:58.714 }' 00:26:58.714 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:58.714 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:58.714 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:58.714 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:58.714 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:58.714 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:58.714 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:58.714 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:58.972 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:58.972 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:58.972 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:58.972 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:58.972 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:58.972 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:58.972 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:59.230 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:59.230 "name": "BaseBdev2", 00:26:59.230 "aliases": [ 00:26:59.230 "9c056747-dbab-4023-b680-05a3df307b85" 00:26:59.230 ], 00:26:59.230 "product_name": "Malloc disk", 00:26:59.230 "block_size": 4096, 00:26:59.230 "num_blocks": 8192, 00:26:59.230 "uuid": "9c056747-dbab-4023-b680-05a3df307b85", 00:26:59.230 "assigned_rate_limits": { 00:26:59.230 "rw_ios_per_sec": 0, 00:26:59.230 "rw_mbytes_per_sec": 0, 00:26:59.230 "r_mbytes_per_sec": 0, 00:26:59.230 "w_mbytes_per_sec": 0 00:26:59.230 }, 00:26:59.230 "claimed": true, 00:26:59.230 "claim_type": "exclusive_write", 00:26:59.230 "zoned": false, 00:26:59.230 "supported_io_types": { 00:26:59.230 "read": true, 00:26:59.230 "write": true, 00:26:59.230 "unmap": true, 00:26:59.230 "flush": true, 00:26:59.230 "reset": true, 00:26:59.230 "nvme_admin": false, 00:26:59.230 "nvme_io": false, 00:26:59.230 "nvme_io_md": false, 00:26:59.230 "write_zeroes": true, 00:26:59.230 "zcopy": true, 00:26:59.230 "get_zone_info": false, 00:26:59.230 "zone_management": false, 00:26:59.230 "zone_append": false, 00:26:59.230 "compare": false, 00:26:59.230 "compare_and_write": false, 00:26:59.230 "abort": true, 00:26:59.230 "seek_hole": false, 00:26:59.230 "seek_data": false, 00:26:59.230 "copy": true, 00:26:59.230 "nvme_iov_md": false 00:26:59.230 }, 00:26:59.230 "memory_domains": [ 00:26:59.230 { 00:26:59.230 "dma_device_id": "system", 00:26:59.230 "dma_device_type": 1 00:26:59.230 }, 00:26:59.230 { 00:26:59.230 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:59.230 "dma_device_type": 2 00:26:59.230 } 00:26:59.230 ], 00:26:59.230 "driver_specific": {} 00:26:59.230 }' 00:26:59.230 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:59.230 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:59.230 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:59.230 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:59.230 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:59.488 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:59.488 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:59.488 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:59.488 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:59.488 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:59.488 13:52:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:59.488 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:59.489 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:59.747 [2024-07-12 13:52:48.170118] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:59.747 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:00.006 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:00.006 "name": "Existed_Raid", 00:27:00.006 "uuid": "ec4305a7-a9a2-4af6-bbd2-16cf68d546fd", 00:27:00.006 "strip_size_kb": 0, 00:27:00.006 "state": "online", 00:27:00.006 "raid_level": "raid1", 00:27:00.006 "superblock": true, 00:27:00.006 "num_base_bdevs": 2, 00:27:00.006 "num_base_bdevs_discovered": 1, 00:27:00.006 "num_base_bdevs_operational": 1, 00:27:00.006 "base_bdevs_list": [ 00:27:00.006 { 00:27:00.006 "name": null, 00:27:00.006 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:00.006 "is_configured": false, 00:27:00.006 "data_offset": 256, 00:27:00.006 "data_size": 7936 00:27:00.006 }, 00:27:00.006 { 00:27:00.006 "name": "BaseBdev2", 00:27:00.006 "uuid": "9c056747-dbab-4023-b680-05a3df307b85", 00:27:00.006 "is_configured": true, 00:27:00.006 "data_offset": 256, 00:27:00.006 "data_size": 7936 00:27:00.006 } 00:27:00.006 ] 00:27:00.006 }' 00:27:00.006 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:00.006 13:52:48 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:00.626 13:52:49 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:00.626 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:00.626 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.626 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:00.888 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:00.888 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:00.888 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:01.147 [2024-07-12 13:52:49.546773] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:01.147 [2024-07-12 13:52:49.546864] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:01.147 [2024-07-12 13:52:49.559618] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:01.147 [2024-07-12 13:52:49.559654] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:01.147 [2024-07-12 13:52:49.559665] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a53a10 name Existed_Raid, state offline 00:27:01.147 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:01.147 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:01.147 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:01.147 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 571622 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 571622 ']' 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 571622 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 571622 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 571622' 00:27:01.406 killing process with pid 571622 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 571622 00:27:01.406 [2024-07-12 13:52:49.877427] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:01.406 13:52:49 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 571622 00:27:01.406 [2024-07-12 13:52:49.878408] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:01.664 13:52:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:27:01.664 00:27:01.664 real 0m10.815s 00:27:01.664 user 0m19.184s 00:27:01.664 sys 0m2.036s 00:27:01.664 13:52:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:01.664 13:52:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:01.664 ************************************ 00:27:01.664 END TEST raid_state_function_test_sb_4k 00:27:01.664 ************************************ 00:27:01.664 13:52:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:01.664 13:52:50 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:27:01.664 13:52:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:01.664 13:52:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:01.664 13:52:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:01.664 ************************************ 00:27:01.664 START TEST raid_superblock_test_4k 00:27:01.664 ************************************ 00:27:01.664 13:52:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:27:01.664 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:01.664 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:01.664 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:01.664 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:01.664 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:01.664 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:01.664 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:01.664 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- 
# raid_pid=573248 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 573248 /var/tmp/spdk-raid.sock 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 573248 ']' 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:01.665 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:01.665 13:52:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:01.924 [2024-07-12 13:52:50.260714] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:27:01.924 [2024-07-12 13:52:50.260786] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid573248 ] 00:27:01.924 [2024-07-12 13:52:50.389672] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:01.924 [2024-07-12 13:52:50.495891] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:02.182 [2024-07-12 13:52:50.563087] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:02.182 [2024-07-12 13:52:50.563125] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:02.747 13:52:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:02.747 13:52:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:27:02.747 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:02.747 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:02.747 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:02.747 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:02.747 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:02.747 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:02.747 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:02.747 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:02.747 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:27:03.004 malloc1 00:27:03.004 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:03.262 [2024-07-12 13:52:51.659143] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:03.262 [2024-07-12 13:52:51.659191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.262 [2024-07-12 13:52:51.659213] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fd9e90 00:27:03.262 [2024-07-12 13:52:51.659226] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.262 [2024-07-12 13:52:51.660789] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.262 [2024-07-12 13:52:51.660819] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:03.262 pt1 00:27:03.262 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:03.262 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:03.262 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:27:03.262 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:03.262 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:03.262 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:03.262 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:03.262 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:03.262 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:27:03.519 malloc2 00:27:03.519 13:52:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:03.777 [2024-07-12 13:52:52.177268] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:03.777 [2024-07-12 13:52:52.177318] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.777 [2024-07-12 13:52:52.177337] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2077fb0 00:27:03.777 [2024-07-12 13:52:52.177351] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.777 [2024-07-12 13:52:52.178948] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.777 [2024-07-12 13:52:52.178978] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:03.777 pt2 00:27:03.777 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:03.777 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:03.777 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:04.035 [2024-07-12 13:52:52.421942] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:04.035 [2024-07-12 13:52:52.423321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:04.035 [2024-07-12 13:52:52.423477] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20786b0 00:27:04.035 [2024-07-12 13:52:52.423490] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:04.035 [2024-07-12 13:52:52.423690] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fdb220 00:27:04.035 [2024-07-12 13:52:52.423837] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20786b0 00:27:04.035 [2024-07-12 13:52:52.423847] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20786b0 00:27:04.035 [2024-07-12 13:52:52.423966] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.035 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:04.293 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:04.293 "name": "raid_bdev1", 00:27:04.293 "uuid": "10281c4a-c105-4fb1-8100-0b6d7722b61a", 00:27:04.293 "strip_size_kb": 0, 00:27:04.293 "state": "online", 00:27:04.293 "raid_level": "raid1", 00:27:04.293 "superblock": true, 00:27:04.293 "num_base_bdevs": 2, 00:27:04.293 "num_base_bdevs_discovered": 2, 00:27:04.293 "num_base_bdevs_operational": 2, 00:27:04.293 "base_bdevs_list": [ 00:27:04.293 { 00:27:04.293 "name": "pt1", 00:27:04.293 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:04.293 "is_configured": true, 00:27:04.293 "data_offset": 256, 00:27:04.293 "data_size": 7936 00:27:04.293 }, 00:27:04.293 { 00:27:04.293 "name": "pt2", 00:27:04.293 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:04.293 "is_configured": true, 00:27:04.293 "data_offset": 256, 00:27:04.293 "data_size": 7936 00:27:04.293 } 00:27:04.293 ] 00:27:04.293 }' 00:27:04.293 13:52:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:04.293 13:52:52 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@10 -- # set +x 00:27:04.864 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:04.864 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:04.864 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:04.864 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:04.864 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:04.864 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:04.864 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:04.864 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:05.122 [2024-07-12 13:52:53.521072] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:05.122 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:05.122 "name": "raid_bdev1", 00:27:05.122 "aliases": [ 00:27:05.122 "10281c4a-c105-4fb1-8100-0b6d7722b61a" 00:27:05.122 ], 00:27:05.122 "product_name": "Raid Volume", 00:27:05.122 "block_size": 4096, 00:27:05.122 "num_blocks": 7936, 00:27:05.122 "uuid": "10281c4a-c105-4fb1-8100-0b6d7722b61a", 00:27:05.122 "assigned_rate_limits": { 00:27:05.122 "rw_ios_per_sec": 0, 00:27:05.122 "rw_mbytes_per_sec": 0, 00:27:05.122 "r_mbytes_per_sec": 0, 00:27:05.122 "w_mbytes_per_sec": 0 00:27:05.122 }, 00:27:05.122 "claimed": false, 00:27:05.122 "zoned": false, 00:27:05.122 "supported_io_types": { 00:27:05.122 "read": true, 00:27:05.122 "write": true, 00:27:05.122 "unmap": false, 00:27:05.122 "flush": false, 00:27:05.122 "reset": true, 00:27:05.122 "nvme_admin": false, 00:27:05.122 "nvme_io": false, 00:27:05.122 "nvme_io_md": false, 00:27:05.122 "write_zeroes": true, 00:27:05.122 "zcopy": false, 00:27:05.122 "get_zone_info": false, 00:27:05.122 "zone_management": false, 00:27:05.122 "zone_append": false, 00:27:05.122 "compare": false, 00:27:05.122 "compare_and_write": false, 00:27:05.122 "abort": false, 00:27:05.122 "seek_hole": false, 00:27:05.122 "seek_data": false, 00:27:05.122 "copy": false, 00:27:05.122 "nvme_iov_md": false 00:27:05.122 }, 00:27:05.122 "memory_domains": [ 00:27:05.122 { 00:27:05.122 "dma_device_id": "system", 00:27:05.122 "dma_device_type": 1 00:27:05.122 }, 00:27:05.122 { 00:27:05.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:05.122 "dma_device_type": 2 00:27:05.122 }, 00:27:05.122 { 00:27:05.122 "dma_device_id": "system", 00:27:05.122 "dma_device_type": 1 00:27:05.122 }, 00:27:05.122 { 00:27:05.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:05.122 "dma_device_type": 2 00:27:05.122 } 00:27:05.122 ], 00:27:05.122 "driver_specific": { 00:27:05.122 "raid": { 00:27:05.122 "uuid": "10281c4a-c105-4fb1-8100-0b6d7722b61a", 00:27:05.122 "strip_size_kb": 0, 00:27:05.122 "state": "online", 00:27:05.122 "raid_level": "raid1", 00:27:05.122 "superblock": true, 00:27:05.122 "num_base_bdevs": 2, 00:27:05.122 "num_base_bdevs_discovered": 2, 00:27:05.122 "num_base_bdevs_operational": 2, 00:27:05.122 "base_bdevs_list": [ 00:27:05.122 { 00:27:05.122 "name": "pt1", 00:27:05.122 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:05.122 "is_configured": true, 00:27:05.122 
"data_offset": 256, 00:27:05.122 "data_size": 7936 00:27:05.122 }, 00:27:05.122 { 00:27:05.122 "name": "pt2", 00:27:05.122 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:05.122 "is_configured": true, 00:27:05.122 "data_offset": 256, 00:27:05.122 "data_size": 7936 00:27:05.122 } 00:27:05.122 ] 00:27:05.122 } 00:27:05.123 } 00:27:05.123 }' 00:27:05.123 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:05.123 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:05.123 pt2' 00:27:05.123 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:05.123 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:05.123 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:05.381 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:05.381 "name": "pt1", 00:27:05.381 "aliases": [ 00:27:05.381 "00000000-0000-0000-0000-000000000001" 00:27:05.381 ], 00:27:05.381 "product_name": "passthru", 00:27:05.381 "block_size": 4096, 00:27:05.381 "num_blocks": 8192, 00:27:05.381 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:05.381 "assigned_rate_limits": { 00:27:05.381 "rw_ios_per_sec": 0, 00:27:05.381 "rw_mbytes_per_sec": 0, 00:27:05.381 "r_mbytes_per_sec": 0, 00:27:05.381 "w_mbytes_per_sec": 0 00:27:05.381 }, 00:27:05.382 "claimed": true, 00:27:05.382 "claim_type": "exclusive_write", 00:27:05.382 "zoned": false, 00:27:05.382 "supported_io_types": { 00:27:05.382 "read": true, 00:27:05.382 "write": true, 00:27:05.382 "unmap": true, 00:27:05.382 "flush": true, 00:27:05.382 "reset": true, 00:27:05.382 "nvme_admin": false, 00:27:05.382 "nvme_io": false, 00:27:05.382 "nvme_io_md": false, 00:27:05.382 "write_zeroes": true, 00:27:05.382 "zcopy": true, 00:27:05.382 "get_zone_info": false, 00:27:05.382 "zone_management": false, 00:27:05.382 "zone_append": false, 00:27:05.382 "compare": false, 00:27:05.382 "compare_and_write": false, 00:27:05.382 "abort": true, 00:27:05.382 "seek_hole": false, 00:27:05.382 "seek_data": false, 00:27:05.382 "copy": true, 00:27:05.382 "nvme_iov_md": false 00:27:05.382 }, 00:27:05.382 "memory_domains": [ 00:27:05.382 { 00:27:05.382 "dma_device_id": "system", 00:27:05.382 "dma_device_type": 1 00:27:05.382 }, 00:27:05.382 { 00:27:05.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:05.382 "dma_device_type": 2 00:27:05.382 } 00:27:05.382 ], 00:27:05.382 "driver_specific": { 00:27:05.382 "passthru": { 00:27:05.382 "name": "pt1", 00:27:05.382 "base_bdev_name": "malloc1" 00:27:05.382 } 00:27:05.382 } 00:27:05.382 }' 00:27:05.382 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:05.382 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:05.382 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:05.382 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:05.641 13:52:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:05.641 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:05.641 13:52:54 bdev_raid.raid_superblock_test_4k 
-- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:05.641 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:05.641 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:05.641 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:05.641 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:05.641 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:05.641 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:05.641 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:05.641 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:05.900 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:05.900 "name": "pt2", 00:27:05.900 "aliases": [ 00:27:05.900 "00000000-0000-0000-0000-000000000002" 00:27:05.900 ], 00:27:05.900 "product_name": "passthru", 00:27:05.900 "block_size": 4096, 00:27:05.900 "num_blocks": 8192, 00:27:05.900 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:05.900 "assigned_rate_limits": { 00:27:05.900 "rw_ios_per_sec": 0, 00:27:05.900 "rw_mbytes_per_sec": 0, 00:27:05.900 "r_mbytes_per_sec": 0, 00:27:05.900 "w_mbytes_per_sec": 0 00:27:05.900 }, 00:27:05.900 "claimed": true, 00:27:05.900 "claim_type": "exclusive_write", 00:27:05.900 "zoned": false, 00:27:05.900 "supported_io_types": { 00:27:05.900 "read": true, 00:27:05.900 "write": true, 00:27:05.900 "unmap": true, 00:27:05.900 "flush": true, 00:27:05.900 "reset": true, 00:27:05.900 "nvme_admin": false, 00:27:05.900 "nvme_io": false, 00:27:05.900 "nvme_io_md": false, 00:27:05.900 "write_zeroes": true, 00:27:05.900 "zcopy": true, 00:27:05.900 "get_zone_info": false, 00:27:05.900 "zone_management": false, 00:27:05.900 "zone_append": false, 00:27:05.900 "compare": false, 00:27:05.900 "compare_and_write": false, 00:27:05.900 "abort": true, 00:27:05.900 "seek_hole": false, 00:27:05.900 "seek_data": false, 00:27:05.900 "copy": true, 00:27:05.900 "nvme_iov_md": false 00:27:05.900 }, 00:27:05.900 "memory_domains": [ 00:27:05.900 { 00:27:05.900 "dma_device_id": "system", 00:27:05.900 "dma_device_type": 1 00:27:05.900 }, 00:27:05.900 { 00:27:05.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:05.900 "dma_device_type": 2 00:27:05.900 } 00:27:05.900 ], 00:27:05.900 "driver_specific": { 00:27:05.900 "passthru": { 00:27:05.900 "name": "pt2", 00:27:05.900 "base_bdev_name": "malloc2" 00:27:05.900 } 00:27:05.900 } 00:27:05.900 }' 00:27:05.900 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:05.900 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:06.159 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:06.159 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:06.159 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:06.159 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:06.159 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:06.159 13:52:54 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:06.159 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:06.159 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:06.159 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:06.419 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:06.419 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:06.419 13:52:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:06.419 [2024-07-12 13:52:54.996980] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:06.679 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=10281c4a-c105-4fb1-8100-0b6d7722b61a 00:27:06.679 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 10281c4a-c105-4fb1-8100-0b6d7722b61a ']' 00:27:06.679 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:06.679 [2024-07-12 13:52:55.241385] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:06.679 [2024-07-12 13:52:55.241410] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:06.679 [2024-07-12 13:52:55.241463] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:06.679 [2024-07-12 13:52:55.241516] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:06.679 [2024-07-12 13:52:55.241528] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20786b0 name raid_bdev1, state offline 00:27:06.939 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:06.939 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:06.939 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:06.939 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:06.939 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:06.939 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:07.198 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:07.198 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:07.458 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:07.458 13:52:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:07.717 13:52:56 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:07.717 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:07.977 [2024-07-12 13:52:56.468577] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:07.977 [2024-07-12 13:52:56.469978] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:07.977 [2024-07-12 13:52:56.470033] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:07.977 [2024-07-12 13:52:56.470076] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:07.977 [2024-07-12 13:52:56.470095] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:07.977 [2024-07-12 13:52:56.470105] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fdc8d0 name raid_bdev1, state configuring 00:27:07.977 request: 00:27:07.977 { 00:27:07.977 "name": "raid_bdev1", 00:27:07.977 "raid_level": "raid1", 00:27:07.977 "base_bdevs": [ 00:27:07.977 "malloc1", 00:27:07.977 "malloc2" 00:27:07.977 ], 00:27:07.977 "superblock": false, 00:27:07.977 "method": "bdev_raid_create", 00:27:07.977 "req_id": 1 00:27:07.977 } 00:27:07.977 Got JSON-RPC error response 00:27:07.977 response: 00:27:07.977 { 00:27:07.977 "code": -17, 00:27:07.977 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:07.977 } 00:27:07.977 13:52:56 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@651 -- # es=1 00:27:07.977 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:07.977 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:07.977 13:52:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:07.978 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.978 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:08.237 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:08.237 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:08.237 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:08.496 [2024-07-12 13:52:56.965819] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:08.496 [2024-07-12 13:52:56.965866] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:08.496 [2024-07-12 13:52:56.965886] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fdd040 00:27:08.497 [2024-07-12 13:52:56.965909] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:08.497 [2024-07-12 13:52:56.967568] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:08.497 [2024-07-12 13:52:56.967598] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:08.497 [2024-07-12 13:52:56.967666] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:08.497 [2024-07-12 13:52:56.967694] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:08.497 pt1 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.497 13:52:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:27:08.756 13:52:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:08.756 "name": "raid_bdev1", 00:27:08.756 "uuid": "10281c4a-c105-4fb1-8100-0b6d7722b61a", 00:27:08.756 "strip_size_kb": 0, 00:27:08.756 "state": "configuring", 00:27:08.756 "raid_level": "raid1", 00:27:08.756 "superblock": true, 00:27:08.756 "num_base_bdevs": 2, 00:27:08.756 "num_base_bdevs_discovered": 1, 00:27:08.756 "num_base_bdevs_operational": 2, 00:27:08.756 "base_bdevs_list": [ 00:27:08.756 { 00:27:08.756 "name": "pt1", 00:27:08.756 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:08.756 "is_configured": true, 00:27:08.756 "data_offset": 256, 00:27:08.756 "data_size": 7936 00:27:08.756 }, 00:27:08.756 { 00:27:08.756 "name": null, 00:27:08.756 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:08.756 "is_configured": false, 00:27:08.756 "data_offset": 256, 00:27:08.756 "data_size": 7936 00:27:08.756 } 00:27:08.756 ] 00:27:08.756 }' 00:27:08.756 13:52:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:08.756 13:52:57 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:09.325 13:52:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:27:09.325 13:52:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:09.325 13:52:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:09.325 13:52:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:09.584 [2024-07-12 13:52:57.980526] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:09.584 [2024-07-12 13:52:57.980573] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:09.584 [2024-07-12 13:52:57.980594] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fdbbf0 00:27:09.584 [2024-07-12 13:52:57.980608] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:09.584 [2024-07-12 13:52:57.980948] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:09.584 [2024-07-12 13:52:57.980968] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:09.584 [2024-07-12 13:52:57.981032] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:09.584 [2024-07-12 13:52:57.981058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:09.584 [2024-07-12 13:52:57.981158] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fd8970 00:27:09.584 [2024-07-12 13:52:57.981169] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:09.584 [2024-07-12 13:52:57.981338] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x207d690 00:27:09.584 [2024-07-12 13:52:57.981464] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fd8970 00:27:09.584 [2024-07-12 13:52:57.981474] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fd8970 00:27:09.584 [2024-07-12 13:52:57.981573] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:09.584 pt2 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.584 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.843 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:09.843 "name": "raid_bdev1", 00:27:09.843 "uuid": "10281c4a-c105-4fb1-8100-0b6d7722b61a", 00:27:09.843 "strip_size_kb": 0, 00:27:09.843 "state": "online", 00:27:09.843 "raid_level": "raid1", 00:27:09.843 "superblock": true, 00:27:09.843 "num_base_bdevs": 2, 00:27:09.843 "num_base_bdevs_discovered": 2, 00:27:09.843 "num_base_bdevs_operational": 2, 00:27:09.843 "base_bdevs_list": [ 00:27:09.843 { 00:27:09.843 "name": "pt1", 00:27:09.843 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:09.843 "is_configured": true, 00:27:09.843 "data_offset": 256, 00:27:09.843 "data_size": 7936 00:27:09.843 }, 00:27:09.843 { 00:27:09.843 "name": "pt2", 00:27:09.843 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:09.843 "is_configured": true, 00:27:09.843 "data_offset": 256, 00:27:09.843 "data_size": 7936 00:27:09.843 } 00:27:09.843 ] 00:27:09.843 }' 00:27:09.843 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:09.843 13:52:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:10.411 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:10.411 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:10.411 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:10.411 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:10.411 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:10.411 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:10.411 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:10.411 13:52:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:10.411 [2024-07-12 13:52:58.987439] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:10.670 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:10.670 "name": "raid_bdev1", 00:27:10.670 "aliases": [ 00:27:10.670 "10281c4a-c105-4fb1-8100-0b6d7722b61a" 00:27:10.670 ], 00:27:10.670 "product_name": "Raid Volume", 00:27:10.670 "block_size": 4096, 00:27:10.670 "num_blocks": 7936, 00:27:10.670 "uuid": "10281c4a-c105-4fb1-8100-0b6d7722b61a", 00:27:10.670 "assigned_rate_limits": { 00:27:10.670 "rw_ios_per_sec": 0, 00:27:10.670 "rw_mbytes_per_sec": 0, 00:27:10.670 "r_mbytes_per_sec": 0, 00:27:10.670 "w_mbytes_per_sec": 0 00:27:10.670 }, 00:27:10.670 "claimed": false, 00:27:10.670 "zoned": false, 00:27:10.670 "supported_io_types": { 00:27:10.670 "read": true, 00:27:10.670 "write": true, 00:27:10.670 "unmap": false, 00:27:10.670 "flush": false, 00:27:10.670 "reset": true, 00:27:10.670 "nvme_admin": false, 00:27:10.670 "nvme_io": false, 00:27:10.670 "nvme_io_md": false, 00:27:10.670 "write_zeroes": true, 00:27:10.670 "zcopy": false, 00:27:10.670 "get_zone_info": false, 00:27:10.670 "zone_management": false, 00:27:10.670 "zone_append": false, 00:27:10.670 "compare": false, 00:27:10.670 "compare_and_write": false, 00:27:10.670 "abort": false, 00:27:10.670 "seek_hole": false, 00:27:10.670 "seek_data": false, 00:27:10.670 "copy": false, 00:27:10.670 "nvme_iov_md": false 00:27:10.670 }, 00:27:10.670 "memory_domains": [ 00:27:10.670 { 00:27:10.670 "dma_device_id": "system", 00:27:10.670 "dma_device_type": 1 00:27:10.670 }, 00:27:10.670 { 00:27:10.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:10.670 "dma_device_type": 2 00:27:10.671 }, 00:27:10.671 { 00:27:10.671 "dma_device_id": "system", 00:27:10.671 "dma_device_type": 1 00:27:10.671 }, 00:27:10.671 { 00:27:10.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:10.671 "dma_device_type": 2 00:27:10.671 } 00:27:10.671 ], 00:27:10.671 "driver_specific": { 00:27:10.671 "raid": { 00:27:10.671 "uuid": "10281c4a-c105-4fb1-8100-0b6d7722b61a", 00:27:10.671 "strip_size_kb": 0, 00:27:10.671 "state": "online", 00:27:10.671 "raid_level": "raid1", 00:27:10.671 "superblock": true, 00:27:10.671 "num_base_bdevs": 2, 00:27:10.671 "num_base_bdevs_discovered": 2, 00:27:10.671 "num_base_bdevs_operational": 2, 00:27:10.671 "base_bdevs_list": [ 00:27:10.671 { 00:27:10.671 "name": "pt1", 00:27:10.671 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:10.671 "is_configured": true, 00:27:10.671 "data_offset": 256, 00:27:10.671 "data_size": 7936 00:27:10.671 }, 00:27:10.671 { 00:27:10.671 "name": "pt2", 00:27:10.671 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:10.671 "is_configured": true, 00:27:10.671 "data_offset": 256, 00:27:10.671 "data_size": 7936 00:27:10.671 } 00:27:10.671 ] 00:27:10.671 } 00:27:10.671 } 00:27:10.671 }' 00:27:10.671 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:10.671 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:10.671 pt2' 00:27:10.671 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:10.671 13:52:59 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:10.671 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:10.930 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:10.930 "name": "pt1", 00:27:10.930 "aliases": [ 00:27:10.930 "00000000-0000-0000-0000-000000000001" 00:27:10.930 ], 00:27:10.930 "product_name": "passthru", 00:27:10.930 "block_size": 4096, 00:27:10.930 "num_blocks": 8192, 00:27:10.930 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:10.930 "assigned_rate_limits": { 00:27:10.930 "rw_ios_per_sec": 0, 00:27:10.930 "rw_mbytes_per_sec": 0, 00:27:10.930 "r_mbytes_per_sec": 0, 00:27:10.930 "w_mbytes_per_sec": 0 00:27:10.930 }, 00:27:10.930 "claimed": true, 00:27:10.930 "claim_type": "exclusive_write", 00:27:10.930 "zoned": false, 00:27:10.930 "supported_io_types": { 00:27:10.930 "read": true, 00:27:10.930 "write": true, 00:27:10.930 "unmap": true, 00:27:10.930 "flush": true, 00:27:10.930 "reset": true, 00:27:10.930 "nvme_admin": false, 00:27:10.930 "nvme_io": false, 00:27:10.930 "nvme_io_md": false, 00:27:10.930 "write_zeroes": true, 00:27:10.930 "zcopy": true, 00:27:10.930 "get_zone_info": false, 00:27:10.930 "zone_management": false, 00:27:10.930 "zone_append": false, 00:27:10.930 "compare": false, 00:27:10.930 "compare_and_write": false, 00:27:10.930 "abort": true, 00:27:10.930 "seek_hole": false, 00:27:10.930 "seek_data": false, 00:27:10.930 "copy": true, 00:27:10.930 "nvme_iov_md": false 00:27:10.930 }, 00:27:10.930 "memory_domains": [ 00:27:10.930 { 00:27:10.930 "dma_device_id": "system", 00:27:10.930 "dma_device_type": 1 00:27:10.930 }, 00:27:10.930 { 00:27:10.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:10.930 "dma_device_type": 2 00:27:10.930 } 00:27:10.930 ], 00:27:10.930 "driver_specific": { 00:27:10.930 "passthru": { 00:27:10.930 "name": "pt1", 00:27:10.930 "base_bdev_name": "malloc1" 00:27:10.930 } 00:27:10.930 } 00:27:10.930 }' 00:27:10.930 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:10.930 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:10.930 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:10.930 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:10.930 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:10.930 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:10.930 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:11.189 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:11.189 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:11.189 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:11.189 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:11.189 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:11.189 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:11.189 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:11.189 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:11.449 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:11.449 "name": "pt2", 00:27:11.449 "aliases": [ 00:27:11.449 "00000000-0000-0000-0000-000000000002" 00:27:11.449 ], 00:27:11.449 "product_name": "passthru", 00:27:11.449 "block_size": 4096, 00:27:11.449 "num_blocks": 8192, 00:27:11.449 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:11.449 "assigned_rate_limits": { 00:27:11.449 "rw_ios_per_sec": 0, 00:27:11.449 "rw_mbytes_per_sec": 0, 00:27:11.449 "r_mbytes_per_sec": 0, 00:27:11.449 "w_mbytes_per_sec": 0 00:27:11.449 }, 00:27:11.449 "claimed": true, 00:27:11.449 "claim_type": "exclusive_write", 00:27:11.449 "zoned": false, 00:27:11.449 "supported_io_types": { 00:27:11.449 "read": true, 00:27:11.449 "write": true, 00:27:11.449 "unmap": true, 00:27:11.449 "flush": true, 00:27:11.449 "reset": true, 00:27:11.449 "nvme_admin": false, 00:27:11.449 "nvme_io": false, 00:27:11.449 "nvme_io_md": false, 00:27:11.449 "write_zeroes": true, 00:27:11.449 "zcopy": true, 00:27:11.449 "get_zone_info": false, 00:27:11.449 "zone_management": false, 00:27:11.449 "zone_append": false, 00:27:11.449 "compare": false, 00:27:11.449 "compare_and_write": false, 00:27:11.449 "abort": true, 00:27:11.449 "seek_hole": false, 00:27:11.449 "seek_data": false, 00:27:11.449 "copy": true, 00:27:11.449 "nvme_iov_md": false 00:27:11.449 }, 00:27:11.449 "memory_domains": [ 00:27:11.449 { 00:27:11.449 "dma_device_id": "system", 00:27:11.449 "dma_device_type": 1 00:27:11.449 }, 00:27:11.449 { 00:27:11.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:11.449 "dma_device_type": 2 00:27:11.449 } 00:27:11.449 ], 00:27:11.449 "driver_specific": { 00:27:11.449 "passthru": { 00:27:11.449 "name": "pt2", 00:27:11.449 "base_bdev_name": "malloc2" 00:27:11.449 } 00:27:11.449 } 00:27:11.449 }' 00:27:11.449 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:11.449 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:11.449 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:11.449 13:52:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:11.707 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:11.707 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:11.707 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:11.707 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:11.707 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:11.707 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:11.707 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:11.707 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:11.708 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:11.708 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 
-- # jq -r '.[] | .uuid' 00:27:11.967 [2024-07-12 13:53:00.415249] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:11.967 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 10281c4a-c105-4fb1-8100-0b6d7722b61a '!=' 10281c4a-c105-4fb1-8100-0b6d7722b61a ']' 00:27:11.967 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:27:11.967 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:11.967 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:11.967 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:12.227 [2024-07-12 13:53:00.667679] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.227 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.486 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:12.486 "name": "raid_bdev1", 00:27:12.486 "uuid": "10281c4a-c105-4fb1-8100-0b6d7722b61a", 00:27:12.486 "strip_size_kb": 0, 00:27:12.486 "state": "online", 00:27:12.486 "raid_level": "raid1", 00:27:12.486 "superblock": true, 00:27:12.486 "num_base_bdevs": 2, 00:27:12.486 "num_base_bdevs_discovered": 1, 00:27:12.486 "num_base_bdevs_operational": 1, 00:27:12.486 "base_bdevs_list": [ 00:27:12.486 { 00:27:12.486 "name": null, 00:27:12.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.486 "is_configured": false, 00:27:12.486 "data_offset": 256, 00:27:12.486 "data_size": 7936 00:27:12.486 }, 00:27:12.486 { 00:27:12.486 "name": "pt2", 00:27:12.486 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:12.486 "is_configured": true, 00:27:12.486 "data_offset": 256, 00:27:12.486 "data_size": 7936 00:27:12.486 } 00:27:12.486 ] 00:27:12.486 }' 00:27:12.486 13:53:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:12.486 13:53:00 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 
00:27:13.051 13:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:13.309 [2024-07-12 13:53:01.762553] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:13.309 [2024-07-12 13:53:01.762578] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:13.309 [2024-07-12 13:53:01.762630] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:13.309 [2024-07-12 13:53:01.762672] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:13.309 [2024-07-12 13:53:01.762683] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fd8970 name raid_bdev1, state offline 00:27:13.309 13:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.309 13:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:13.568 13:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:27:13.568 13:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:13.568 13:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:13.568 13:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:13.568 13:53:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:13.827 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:13.827 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:13.827 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:13.827 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:13.827 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:27:13.827 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:13.827 [2024-07-12 13:53:02.384171] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:13.827 [2024-07-12 13:53:02.384214] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:13.827 [2024-07-12 13:53:02.384233] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x207cf00 00:27:13.827 [2024-07-12 13:53:02.384246] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:13.827 [2024-07-12 13:53:02.385839] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:13.827 [2024-07-12 13:53:02.385869] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:13.827 [2024-07-12 13:53:02.385946] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:13.827 [2024-07-12 13:53:02.385972] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:13.827 [2024-07-12 13:53:02.386054] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x207a970 00:27:13.827 [2024-07-12 13:53:02.386065] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:13.827 [2024-07-12 13:53:02.386234] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x207a240 00:27:13.827 [2024-07-12 13:53:02.386350] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x207a970 00:27:13.827 [2024-07-12 13:53:02.386360] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x207a970 00:27:13.827 [2024-07-12 13:53:02.386454] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:13.827 pt2 00:27:13.827 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:14.086 "name": "raid_bdev1", 00:27:14.086 "uuid": "10281c4a-c105-4fb1-8100-0b6d7722b61a", 00:27:14.086 "strip_size_kb": 0, 00:27:14.086 "state": "online", 00:27:14.086 "raid_level": "raid1", 00:27:14.086 "superblock": true, 00:27:14.086 "num_base_bdevs": 2, 00:27:14.086 "num_base_bdevs_discovered": 1, 00:27:14.086 "num_base_bdevs_operational": 1, 00:27:14.086 "base_bdevs_list": [ 00:27:14.086 { 00:27:14.086 "name": null, 00:27:14.086 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:14.086 "is_configured": false, 00:27:14.086 "data_offset": 256, 00:27:14.086 "data_size": 7936 00:27:14.086 }, 00:27:14.086 { 00:27:14.086 "name": "pt2", 00:27:14.086 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:14.086 "is_configured": true, 00:27:14.086 "data_offset": 256, 00:27:14.086 "data_size": 7936 00:27:14.086 } 00:27:14.086 ] 00:27:14.086 }' 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:14.086 13:53:02 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:15.031 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
raid_bdev1 00:27:15.031 [2024-07-12 13:53:03.467030] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:15.031 [2024-07-12 13:53:03.467056] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:15.032 [2024-07-12 13:53:03.467103] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:15.032 [2024-07-12 13:53:03.467147] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:15.032 [2024-07-12 13:53:03.467159] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x207a970 name raid_bdev1, state offline 00:27:15.032 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.032 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:15.290 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:15.290 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:15.290 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:27:15.290 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:15.548 [2024-07-12 13:53:03.972349] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:15.548 [2024-07-12 13:53:03.972393] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:15.548 [2024-07-12 13:53:03.972413] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fd9230 00:27:15.548 [2024-07-12 13:53:03.972426] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:15.548 [2024-07-12 13:53:03.974035] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:15.548 [2024-07-12 13:53:03.974064] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:15.548 [2024-07-12 13:53:03.974128] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:15.548 [2024-07-12 13:53:03.974154] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:15.548 [2024-07-12 13:53:03.974251] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:15.548 [2024-07-12 13:53:03.974264] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:15.549 [2024-07-12 13:53:03.974278] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x207b5c0 name raid_bdev1, state configuring 00:27:15.549 [2024-07-12 13:53:03.974301] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:15.549 [2024-07-12 13:53:03.974355] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x207d130 00:27:15.549 [2024-07-12 13:53:03.974365] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:15.549 [2024-07-12 13:53:03.974528] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x207b650 00:27:15.549 [2024-07-12 13:53:03.974652] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x207d130 
00:27:15.549 [2024-07-12 13:53:03.974662] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x207d130 00:27:15.549 [2024-07-12 13:53:03.974756] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:15.549 pt1 00:27:15.549 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:27:15.549 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:15.549 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:15.549 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.549 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.549 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.549 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:15.549 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:15.549 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.549 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.549 13:53:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.549 13:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.549 13:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.807 13:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:15.808 "name": "raid_bdev1", 00:27:15.808 "uuid": "10281c4a-c105-4fb1-8100-0b6d7722b61a", 00:27:15.808 "strip_size_kb": 0, 00:27:15.808 "state": "online", 00:27:15.808 "raid_level": "raid1", 00:27:15.808 "superblock": true, 00:27:15.808 "num_base_bdevs": 2, 00:27:15.808 "num_base_bdevs_discovered": 1, 00:27:15.808 "num_base_bdevs_operational": 1, 00:27:15.808 "base_bdevs_list": [ 00:27:15.808 { 00:27:15.808 "name": null, 00:27:15.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.808 "is_configured": false, 00:27:15.808 "data_offset": 256, 00:27:15.808 "data_size": 7936 00:27:15.808 }, 00:27:15.808 { 00:27:15.808 "name": "pt2", 00:27:15.808 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:15.808 "is_configured": true, 00:27:15.808 "data_offset": 256, 00:27:15.808 "data_size": 7936 00:27:15.808 } 00:27:15.808 ] 00:27:15.808 }' 00:27:15.808 13:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:15.808 13:53:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:16.374 13:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:16.374 13:53:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:16.632 13:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:16.632 13:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:16.632 13:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:16.892 [2024-07-12 13:53:05.336202] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 10281c4a-c105-4fb1-8100-0b6d7722b61a '!=' 10281c4a-c105-4fb1-8100-0b6d7722b61a ']' 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 573248 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 573248 ']' 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 573248 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 573248 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 573248' 00:27:16.892 killing process with pid 573248 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 573248 00:27:16.892 [2024-07-12 13:53:05.406489] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:16.892 [2024-07-12 13:53:05.406538] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:16.892 [2024-07-12 13:53:05.406579] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:16.892 [2024-07-12 13:53:05.406590] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x207d130 name raid_bdev1, state offline 00:27:16.892 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 573248 00:27:16.892 [2024-07-12 13:53:05.422983] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:17.152 13:53:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:27:17.152 00:27:17.152 real 0m15.433s 00:27:17.152 user 0m27.962s 00:27:17.152 sys 0m2.860s 00:27:17.152 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:17.152 13:53:05 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:17.152 ************************************ 00:27:17.152 END TEST raid_superblock_test_4k 00:27:17.152 ************************************ 00:27:17.152 13:53:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:17.152 13:53:05 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:27:17.152 13:53:05 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:27:17.152 13:53:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:17.152 13:53:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:17.152 13:53:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:17.152 
************************************ 00:27:17.152 START TEST raid_rebuild_test_sb_4k 00:27:17.152 ************************************ 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=575493 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 575493 /var/tmp/spdk-raid.sock 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 575493 ']' 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 
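At this point the harness has launched bdevperf as the SPDK application that will host the RAID bdev, pointed it at a private RPC socket, and is about to wait for that socket before configuring anything. A minimal sketch of that setup, reusing the paths and flags recorded in the trace; the poll loop is only a stand-in for the harness's waitforlisten helper and is an assumption, not the harness code:

    #!/usr/bin/env bash
    set -e
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    SOCK=/var/tmp/spdk-raid.sock

    # Launch bdevperf against a dedicated RPC socket; flags copied from the run above.
    # It stays idle until it is configured and driven over that socket.
    "$SPDK/build/examples/bdevperf" -r "$SOCK" -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 3M -q 2 -U -z -L bdev_raid &
    bdevperf_pid=$!

    # Stand-in for waitforlisten: poll until the RPC socket answers.
    until "$SPDK/scripts/rpc.py" -s "$SOCK" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
    done

Once the socket answers, every following step in the trace is an rpc.py call against it.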
00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:17.152 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:17.152 13:53:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:17.412 [2024-07-12 13:53:05.781207] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:27:17.412 [2024-07-12 13:53:05.781275] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid575493 ] 00:27:17.412 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:17.412 Zero copy mechanism will not be used. 00:27:17.412 [2024-07-12 13:53:05.910961] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:17.671 [2024-07-12 13:53:06.013963] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:17.671 [2024-07-12 13:53:06.076579] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:17.671 [2024-07-12 13:53:06.076626] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:18.240 13:53:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:18.240 13:53:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:27:18.240 13:53:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:18.240 13:53:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:27:18.499 BaseBdev1_malloc 00:27:18.499 13:53:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:18.758 [2024-07-12 13:53:07.193370] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:18.758 [2024-07-12 13:53:07.193422] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:18.758 [2024-07-12 13:53:07.193448] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15aa680 00:27:18.758 [2024-07-12 13:53:07.193462] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:18.758 [2024-07-12 13:53:07.195242] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:18.758 [2024-07-12 13:53:07.195274] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:18.758 BaseBdev1 00:27:18.758 13:53:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:18.758 13:53:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:27:19.017 BaseBdev2_malloc 
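The entries just above and immediately below assemble each RAID member as a malloc bdev fronted by a passthru bdev, plus a delay-backed "spare" that will later serve as the rebuild target, and then create the RAID1 bdev with an on-disk superblock. A condensed sketch of that construction; every RPC is copied from the trace, while the loop and the RPC shorthand are mine:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    for b in BaseBdev1 BaseBdev2; do
        # 32 MiB malloc bdev with 4 KiB blocks, wrapped in a passthru bdev.
        $RPC bdev_malloc_create 32 4096 -b ${b}_malloc
        $RPC bdev_passthru_create -b ${b}_malloc -p $b
    done

    # The future rebuild target sits behind a delay bdev (latency values as recorded),
    # presumably so the rebuild later in the test stays observable while in flight.
    $RPC bdev_malloc_create 32 4096 -b spare_malloc
    $RPC bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $RPC bdev_passthru_create -b spare_delay -p spare

    # RAID1 over the two base bdevs, with a superblock (-s) so the array can be
    # re-assembled from its members later in the test.
    $RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1

    # The same query verify_raid_bdev_state uses to check state, level and member count.
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'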
00:27:19.017 13:53:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:19.276 [2024-07-12 13:53:07.687586] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:19.276 [2024-07-12 13:53:07.687633] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:19.276 [2024-07-12 13:53:07.687656] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15ab1a0 00:27:19.277 [2024-07-12 13:53:07.687669] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:19.277 [2024-07-12 13:53:07.689234] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:19.277 [2024-07-12 13:53:07.689262] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:19.277 BaseBdev2 00:27:19.277 13:53:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:27:19.536 spare_malloc 00:27:19.536 13:53:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:19.795 spare_delay 00:27:19.795 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:20.056 [2024-07-12 13:53:08.410101] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:20.056 [2024-07-12 13:53:08.410148] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:20.056 [2024-07-12 13:53:08.410169] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1759800 00:27:20.056 [2024-07-12 13:53:08.410182] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:20.056 [2024-07-12 13:53:08.411768] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:20.056 [2024-07-12 13:53:08.411797] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:20.056 spare 00:27:20.056 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:20.315 [2024-07-12 13:53:08.658788] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:20.315 [2024-07-12 13:53:08.660185] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:20.315 [2024-07-12 13:53:08.660356] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x175a9b0 00:27:20.315 [2024-07-12 13:53:08.660370] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:20.315 [2024-07-12 13:53:08.660573] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1753dd0 00:27:20.315 [2024-07-12 13:53:08.660717] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x175a9b0 00:27:20.315 [2024-07-12 13:53:08.660727] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: 
raid bdev is created with name raid_bdev1, raid_bdev 0x175a9b0 00:27:20.315 [2024-07-12 13:53:08.660828] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.315 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.575 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:20.575 "name": "raid_bdev1", 00:27:20.575 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:20.575 "strip_size_kb": 0, 00:27:20.575 "state": "online", 00:27:20.575 "raid_level": "raid1", 00:27:20.575 "superblock": true, 00:27:20.575 "num_base_bdevs": 2, 00:27:20.575 "num_base_bdevs_discovered": 2, 00:27:20.575 "num_base_bdevs_operational": 2, 00:27:20.575 "base_bdevs_list": [ 00:27:20.575 { 00:27:20.575 "name": "BaseBdev1", 00:27:20.575 "uuid": "7fec36fc-47ae-5cb8-a53f-b3ab926a5210", 00:27:20.575 "is_configured": true, 00:27:20.575 "data_offset": 256, 00:27:20.575 "data_size": 7936 00:27:20.575 }, 00:27:20.575 { 00:27:20.575 "name": "BaseBdev2", 00:27:20.575 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:20.575 "is_configured": true, 00:27:20.575 "data_offset": 256, 00:27:20.575 "data_size": 7936 00:27:20.575 } 00:27:20.575 ] 00:27:20.575 }' 00:27:20.575 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:20.575 13:53:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:21.143 13:53:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:21.143 13:53:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:21.401 [2024-07-12 13:53:09.777991] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:21.401 13:53:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:21.401 13:53:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:27:21.401 13:53:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:21.660 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:21.917 [2024-07-12 13:53:10.343287] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1753dd0 00:27:21.917 /dev/nbd0 00:27:21.917 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:21.917 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:21.917 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:21.917 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:21.917 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:21.917 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:21.917 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:21.918 1+0 records in 00:27:21.918 1+0 records out 00:27:21.918 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238732 s, 17.2 MB/s 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:27:21.918 13:53:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:22.848 7936+0 records in 00:27:22.848 7936+0 records out 00:27:22.848 32505856 bytes (33 MB, 31 MiB) copied, 0.768823 s, 42.3 MB/s 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:22.848 [2024-07-12 13:53:11.379458] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:22.848 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:23.175 [2024-07-12 13:53:11.551973] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:23.175 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:23.175 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:23.175 13:53:11 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:23.175 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:23.175 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:23.175 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:23.175 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:23.175 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:23.175 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:23.175 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:23.175 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.175 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.464 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:23.464 "name": "raid_bdev1", 00:27:23.464 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:23.464 "strip_size_kb": 0, 00:27:23.464 "state": "online", 00:27:23.464 "raid_level": "raid1", 00:27:23.464 "superblock": true, 00:27:23.464 "num_base_bdevs": 2, 00:27:23.464 "num_base_bdevs_discovered": 1, 00:27:23.464 "num_base_bdevs_operational": 1, 00:27:23.464 "base_bdevs_list": [ 00:27:23.464 { 00:27:23.464 "name": null, 00:27:23.464 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.464 "is_configured": false, 00:27:23.464 "data_offset": 256, 00:27:23.464 "data_size": 7936 00:27:23.464 }, 00:27:23.464 { 00:27:23.464 "name": "BaseBdev2", 00:27:23.464 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:23.464 "is_configured": true, 00:27:23.464 "data_offset": 256, 00:27:23.464 "data_size": 7936 00:27:23.464 } 00:27:23.464 ] 00:27:23.464 }' 00:27:23.464 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:23.464 13:53:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:24.030 13:53:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:24.030 [2024-07-12 13:53:12.586714] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:24.030 [2024-07-12 13:53:12.591694] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x175a620 00:27:24.030 [2024-07-12 13:53:12.593895] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:24.030 13:53:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:25.408 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:25.408 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:25.408 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:25.408 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:25.408 13:53:13 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:25.408 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.408 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:25.408 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:25.408 "name": "raid_bdev1", 00:27:25.408 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:25.408 "strip_size_kb": 0, 00:27:25.408 "state": "online", 00:27:25.408 "raid_level": "raid1", 00:27:25.408 "superblock": true, 00:27:25.408 "num_base_bdevs": 2, 00:27:25.408 "num_base_bdevs_discovered": 2, 00:27:25.408 "num_base_bdevs_operational": 2, 00:27:25.408 "process": { 00:27:25.408 "type": "rebuild", 00:27:25.408 "target": "spare", 00:27:25.408 "progress": { 00:27:25.408 "blocks": 3072, 00:27:25.408 "percent": 38 00:27:25.408 } 00:27:25.408 }, 00:27:25.408 "base_bdevs_list": [ 00:27:25.408 { 00:27:25.408 "name": "spare", 00:27:25.408 "uuid": "ccb8df24-b3a6-5ff6-afa3-5f83bb9b642d", 00:27:25.408 "is_configured": true, 00:27:25.408 "data_offset": 256, 00:27:25.408 "data_size": 7936 00:27:25.408 }, 00:27:25.408 { 00:27:25.408 "name": "BaseBdev2", 00:27:25.408 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:25.409 "is_configured": true, 00:27:25.409 "data_offset": 256, 00:27:25.409 "data_size": 7936 00:27:25.409 } 00:27:25.409 ] 00:27:25.409 }' 00:27:25.409 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:25.409 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:25.409 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:25.668 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:25.668 13:53:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:25.668 [2024-07-12 13:53:14.217678] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:25.927 [2024-07-12 13:53:14.307427] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:25.927 [2024-07-12 13:53:14.307474] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:25.927 [2024-07-12 13:53:14.307491] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:25.927 [2024-07-12 13:53:14.307499] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.927 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.187 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.187 "name": "raid_bdev1", 00:27:26.187 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:26.187 "strip_size_kb": 0, 00:27:26.187 "state": "online", 00:27:26.187 "raid_level": "raid1", 00:27:26.187 "superblock": true, 00:27:26.187 "num_base_bdevs": 2, 00:27:26.187 "num_base_bdevs_discovered": 1, 00:27:26.187 "num_base_bdevs_operational": 1, 00:27:26.187 "base_bdevs_list": [ 00:27:26.187 { 00:27:26.187 "name": null, 00:27:26.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.187 "is_configured": false, 00:27:26.187 "data_offset": 256, 00:27:26.187 "data_size": 7936 00:27:26.187 }, 00:27:26.187 { 00:27:26.187 "name": "BaseBdev2", 00:27:26.187 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:26.187 "is_configured": true, 00:27:26.187 "data_offset": 256, 00:27:26.187 "data_size": 7936 00:27:26.187 } 00:27:26.187 ] 00:27:26.187 }' 00:27:26.187 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.187 13:53:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:26.755 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:26.755 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:26.755 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:26.755 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:26.755 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:26.755 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.755 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.014 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:27.014 "name": "raid_bdev1", 00:27:27.014 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:27.014 "strip_size_kb": 0, 00:27:27.014 "state": "online", 00:27:27.014 "raid_level": "raid1", 00:27:27.014 "superblock": true, 00:27:27.014 "num_base_bdevs": 2, 00:27:27.014 "num_base_bdevs_discovered": 1, 00:27:27.014 "num_base_bdevs_operational": 1, 00:27:27.014 "base_bdevs_list": [ 00:27:27.014 { 00:27:27.014 "name": null, 00:27:27.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:27.014 "is_configured": false, 00:27:27.014 "data_offset": 
256, 00:27:27.014 "data_size": 7936 00:27:27.014 }, 00:27:27.014 { 00:27:27.014 "name": "BaseBdev2", 00:27:27.014 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:27.014 "is_configured": true, 00:27:27.014 "data_offset": 256, 00:27:27.014 "data_size": 7936 00:27:27.014 } 00:27:27.014 ] 00:27:27.014 }' 00:27:27.014 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:27.014 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:27.014 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:27.014 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:27.014 13:53:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:27.583 [2024-07-12 13:53:16.073259] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:27.583 [2024-07-12 13:53:16.078230] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x175a620 00:27:27.583 [2024-07-12 13:53:16.079691] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:27.583 13:53:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:28.963 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:28.963 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:28.963 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:28.963 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:28.963 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:28.963 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.963 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.222 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:29.222 "name": "raid_bdev1", 00:27:29.222 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:29.222 "strip_size_kb": 0, 00:27:29.222 "state": "online", 00:27:29.222 "raid_level": "raid1", 00:27:29.222 "superblock": true, 00:27:29.222 "num_base_bdevs": 2, 00:27:29.222 "num_base_bdevs_discovered": 2, 00:27:29.222 "num_base_bdevs_operational": 2, 00:27:29.222 "process": { 00:27:29.222 "type": "rebuild", 00:27:29.222 "target": "spare", 00:27:29.222 "progress": { 00:27:29.222 "blocks": 3840, 00:27:29.222 "percent": 48 00:27:29.222 } 00:27:29.222 }, 00:27:29.222 "base_bdevs_list": [ 00:27:29.222 { 00:27:29.222 "name": "spare", 00:27:29.222 "uuid": "ccb8df24-b3a6-5ff6-afa3-5f83bb9b642d", 00:27:29.222 "is_configured": true, 00:27:29.222 "data_offset": 256, 00:27:29.222 "data_size": 7936 00:27:29.222 }, 00:27:29.222 { 00:27:29.222 "name": "BaseBdev2", 00:27:29.222 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:29.222 "is_configured": true, 00:27:29.222 "data_offset": 256, 00:27:29.222 "data_size": 7936 00:27:29.222 } 00:27:29.222 ] 00:27:29.222 }' 00:27:29.222 
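The degrade-and-rebuild cycle traced around this point reduces to three RPCs plus polling of the process fields that verify_raid_bdev_process reads (.process.type, .process.target, .process.progress). A minimal sketch of that loop under the same socket and bdev names; the RPCs and jq filters are the ones recorded above, the polling loop itself is mine:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Degrade the array, then attach the spare; the trace shows the rebuild starting
    # automatically as soon as the new base bdev is added.
    $RPC bdev_raid_remove_base_bdev BaseBdev1
    $RPC bdev_raid_add_base_bdev raid_bdev1 spare

    # Poll the per-raid process info until no rebuild is reported any more.
    while :; do
        info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
        [ "$(echo "$info" | jq -r '.process.type // "none"')" = rebuild ] || break
        echo "rebuild target=$(echo "$info" | jq -r '.process.target // "none"')" \
             "blocks=$(echo "$info" | jq -r '.process.progress.blocks')"
        sleep 1
    done

The trace also exercises the abort path by removing the spare while the rebuild is still running, which is why the process info above briefly reports progress and then drops back to a single operational base bdev.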
13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:29.222 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:29.222 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:29.222 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:29.223 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1061 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:29.223 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.482 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:29.482 "name": "raid_bdev1", 00:27:29.482 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:29.482 "strip_size_kb": 0, 00:27:29.482 "state": "online", 00:27:29.482 "raid_level": "raid1", 00:27:29.482 "superblock": true, 00:27:29.482 "num_base_bdevs": 2, 00:27:29.482 "num_base_bdevs_discovered": 2, 00:27:29.482 "num_base_bdevs_operational": 2, 00:27:29.482 "process": { 00:27:29.482 "type": "rebuild", 00:27:29.482 "target": "spare", 00:27:29.482 "progress": { 00:27:29.482 "blocks": 4608, 00:27:29.482 "percent": 58 00:27:29.482 } 00:27:29.482 }, 00:27:29.482 "base_bdevs_list": [ 00:27:29.482 { 00:27:29.482 "name": "spare", 00:27:29.482 "uuid": "ccb8df24-b3a6-5ff6-afa3-5f83bb9b642d", 00:27:29.482 "is_configured": true, 00:27:29.482 "data_offset": 256, 00:27:29.482 "data_size": 7936 00:27:29.482 }, 00:27:29.482 { 00:27:29.482 "name": "BaseBdev2", 00:27:29.482 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:29.482 "is_configured": true, 00:27:29.482 "data_offset": 256, 00:27:29.482 "data_size": 7936 00:27:29.482 } 00:27:29.482 ] 00:27:29.482 }' 00:27:29.482 13:53:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:29.741 
13:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:29.741 13:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:29.741 13:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:29.741 13:53:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:30.679 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:30.679 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:30.679 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:30.679 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:30.679 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:30.679 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:30.679 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.679 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:30.679 [2024-07-12 13:53:19.203910] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:30.679 [2024-07-12 13:53:19.203972] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:30.679 [2024-07-12 13:53:19.204052] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:31.248 "name": "raid_bdev1", 00:27:31.248 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:31.248 "strip_size_kb": 0, 00:27:31.248 "state": "online", 00:27:31.248 "raid_level": "raid1", 00:27:31.248 "superblock": true, 00:27:31.248 "num_base_bdevs": 2, 00:27:31.248 "num_base_bdevs_discovered": 2, 00:27:31.248 "num_base_bdevs_operational": 2, 00:27:31.248 "base_bdevs_list": [ 00:27:31.248 { 00:27:31.248 "name": "spare", 00:27:31.248 "uuid": "ccb8df24-b3a6-5ff6-afa3-5f83bb9b642d", 00:27:31.248 "is_configured": true, 00:27:31.248 "data_offset": 256, 00:27:31.248 "data_size": 7936 00:27:31.248 }, 00:27:31.248 { 00:27:31.248 "name": "BaseBdev2", 00:27:31.248 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:31.248 "is_configured": true, 00:27:31.248 "data_offset": 256, 00:27:31.248 "data_size": 7936 00:27:31.248 } 00:27:31.248 ] 00:27:31.248 }' 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.248 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.508 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:31.508 "name": "raid_bdev1", 00:27:31.508 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:31.508 "strip_size_kb": 0, 00:27:31.508 "state": "online", 00:27:31.508 "raid_level": "raid1", 00:27:31.508 "superblock": true, 00:27:31.508 "num_base_bdevs": 2, 00:27:31.508 "num_base_bdevs_discovered": 2, 00:27:31.508 "num_base_bdevs_operational": 2, 00:27:31.508 "base_bdevs_list": [ 00:27:31.508 { 00:27:31.508 "name": "spare", 00:27:31.508 "uuid": "ccb8df24-b3a6-5ff6-afa3-5f83bb9b642d", 00:27:31.508 "is_configured": true, 00:27:31.508 "data_offset": 256, 00:27:31.508 "data_size": 7936 00:27:31.508 }, 00:27:31.508 { 00:27:31.508 "name": "BaseBdev2", 00:27:31.508 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:31.508 "is_configured": true, 00:27:31.508 "data_offset": 256, 00:27:31.508 "data_size": 7936 00:27:31.508 } 00:27:31.509 ] 00:27:31.509 }' 00:27:31.509 13:53:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:31.509 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:31.509 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.768 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.028 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:32.028 "name": "raid_bdev1", 00:27:32.028 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:32.028 "strip_size_kb": 0, 00:27:32.028 "state": "online", 00:27:32.028 "raid_level": "raid1", 00:27:32.028 "superblock": true, 00:27:32.028 "num_base_bdevs": 2, 00:27:32.028 "num_base_bdevs_discovered": 2, 00:27:32.028 "num_base_bdevs_operational": 2, 00:27:32.028 "base_bdevs_list": [ 00:27:32.028 { 00:27:32.028 "name": "spare", 00:27:32.028 "uuid": "ccb8df24-b3a6-5ff6-afa3-5f83bb9b642d", 00:27:32.028 "is_configured": true, 00:27:32.028 "data_offset": 256, 00:27:32.028 "data_size": 7936 00:27:32.028 }, 00:27:32.028 { 00:27:32.028 "name": "BaseBdev2", 00:27:32.028 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:32.028 "is_configured": true, 00:27:32.028 "data_offset": 256, 00:27:32.028 "data_size": 7936 00:27:32.028 } 00:27:32.028 ] 00:27:32.028 }' 00:27:32.028 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:32.028 13:53:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:32.965 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:32.965 [2024-07-12 13:53:21.390698] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:32.965 [2024-07-12 13:53:21.390729] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:32.965 [2024-07-12 13:53:21.390787] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:32.965 [2024-07-12 13:53:21.390840] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:32.965 [2024-07-12 13:53:21.390852] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x175a9b0 name raid_bdev1, state offline 00:27:32.965 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.965 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:27:33.225 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:33.225 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:33.225 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:27:33.225 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:33.226 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:33.226 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:33.226 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:33.226 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:33.226 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:33.226 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:33.226 13:53:21 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:33.226 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:33.226 13:53:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:33.795 /dev/nbd0 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:33.795 1+0 records in 00:27:33.795 1+0 records out 00:27:33.795 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247281 s, 16.6 MB/s 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:33.795 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:34.054 /dev/nbd1 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 
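The block around this point exports the original member and the rebuilt spare over NBD and byte-compares them past the metadata region (the cmp follows just below). A condensed sketch of that check; the device paths and the 1 MiB offset are taken from the trace, and the offset matches the 256-block data_offset at 4 KiB per block:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Expose both former RAID members as local block devices.
    $RPC nbd_start_disk BaseBdev1 /dev/nbd0
    $RPC nbd_start_disk spare /dev/nbd1

    # Skip the first 1 MiB (256 blocks * 4096 B) on both devices, which holds the
    # raid superblock/metadata rather than mirrored data, then compare the rest.
    cmp -i 1048576 /dev/nbd0 /dev/nbd1

    $RPC nbd_stop_disk /dev/nbd0
    $RPC nbd_stop_disk /dev/nbd1

A zero exit status from cmp confirms the rebuild copied the data region of BaseBdev1 onto the spare byte for byte.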
00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:34.054 1+0 records in 00:27:34.054 1+0 records out 00:27:34.054 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000335114 s, 12.2 MB/s 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:34.054 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:34.312 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:34.312 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:34.312 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:34.312 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:34.312 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:34.312 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:34.312 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:34.312 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:34.312 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:34.313 13:53:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:34.576 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:34.576 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:34.576 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:34.576 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:34.576 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:34.576 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:34.576 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:34.576 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:34.576 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:34.576 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:34.833 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:35.092 [2024-07-12 13:53:23.457262] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:35.092 [2024-07-12 13:53:23.457309] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:35.092 [2024-07-12 13:53:23.457329] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1759e40 00:27:35.092 [2024-07-12 13:53:23.457348] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:35.092 [2024-07-12 13:53:23.458971] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:35.092 [2024-07-12 13:53:23.459002] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:35.092 [2024-07-12 13:53:23.459080] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:35.092 [2024-07-12 13:53:23.459105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:35.092 [2024-07-12 13:53:23.459206] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:35.092 spare 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:35.092 
13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:35.092 [2024-07-12 13:53:23.559518] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17575b0 00:27:35.092 [2024-07-12 13:53:23.559535] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:35.092 [2024-07-12 13:53:23.559730] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1753dd0 00:27:35.092 [2024-07-12 13:53:23.559874] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17575b0 00:27:35.092 [2024-07-12 13:53:23.559884] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17575b0 00:27:35.092 [2024-07-12 13:53:23.560010] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:35.092 "name": "raid_bdev1", 00:27:35.092 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:35.092 "strip_size_kb": 0, 00:27:35.092 "state": "online", 00:27:35.092 "raid_level": "raid1", 00:27:35.092 "superblock": true, 00:27:35.092 "num_base_bdevs": 2, 00:27:35.092 "num_base_bdevs_discovered": 2, 00:27:35.092 "num_base_bdevs_operational": 2, 00:27:35.092 "base_bdevs_list": [ 00:27:35.092 { 00:27:35.092 "name": "spare", 00:27:35.092 "uuid": "ccb8df24-b3a6-5ff6-afa3-5f83bb9b642d", 00:27:35.092 "is_configured": true, 00:27:35.092 "data_offset": 256, 00:27:35.092 "data_size": 7936 00:27:35.092 }, 00:27:35.092 { 00:27:35.092 "name": "BaseBdev2", 00:27:35.092 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:35.092 "is_configured": true, 00:27:35.092 "data_offset": 256, 00:27:35.092 "data_size": 7936 00:27:35.092 } 00:27:35.092 ] 00:27:35.092 }' 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:35.092 13:53:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:36.029 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:36.029 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:36.029 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:36.029 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:36.029 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:36.029 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.029 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:36.029 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:27:36.029 "name": "raid_bdev1", 00:27:36.029 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:36.029 "strip_size_kb": 0, 00:27:36.029 "state": "online", 00:27:36.029 "raid_level": "raid1", 00:27:36.029 "superblock": true, 00:27:36.029 "num_base_bdevs": 2, 00:27:36.029 "num_base_bdevs_discovered": 2, 00:27:36.029 "num_base_bdevs_operational": 2, 00:27:36.029 "base_bdevs_list": [ 00:27:36.029 { 00:27:36.029 "name": "spare", 00:27:36.029 "uuid": "ccb8df24-b3a6-5ff6-afa3-5f83bb9b642d", 00:27:36.029 "is_configured": true, 00:27:36.029 "data_offset": 256, 00:27:36.029 "data_size": 7936 00:27:36.029 }, 00:27:36.029 { 00:27:36.029 "name": "BaseBdev2", 00:27:36.029 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:36.029 "is_configured": true, 00:27:36.029 "data_offset": 256, 00:27:36.029 "data_size": 7936 00:27:36.029 } 00:27:36.029 ] 00:27:36.029 }' 00:27:36.029 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:36.029 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:36.029 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:36.288 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:36.288 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.288 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:36.546 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:36.546 13:53:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:36.546 [2024-07-12 13:53:25.101931] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:36.806 "name": "raid_bdev1", 00:27:36.806 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:36.806 "strip_size_kb": 0, 00:27:36.806 "state": "online", 00:27:36.806 "raid_level": "raid1", 00:27:36.806 "superblock": true, 00:27:36.806 "num_base_bdevs": 2, 00:27:36.806 "num_base_bdevs_discovered": 1, 00:27:36.806 "num_base_bdevs_operational": 1, 00:27:36.806 "base_bdevs_list": [ 00:27:36.806 { 00:27:36.806 "name": null, 00:27:36.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:36.806 "is_configured": false, 00:27:36.806 "data_offset": 256, 00:27:36.806 "data_size": 7936 00:27:36.806 }, 00:27:36.806 { 00:27:36.806 "name": "BaseBdev2", 00:27:36.806 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:36.806 "is_configured": true, 00:27:36.806 "data_offset": 256, 00:27:36.806 "data_size": 7936 00:27:36.806 } 00:27:36.806 ] 00:27:36.806 }' 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:36.806 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:37.742 13:53:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:37.742 [2024-07-12 13:53:26.216895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:37.742 [2024-07-12 13:53:26.217046] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:37.742 [2024-07-12 13:53:26.217063] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
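The trace above has just removed the base bdev 'spare' from raid_bdev1 and handed it back through bdev_raid_add_base_bdev; because the superblock sequence number on 'spare' (4) is older than the array's (5), the examine path re-adds it and a rebuild is started. As a rough sketch only, the same cycle can be driven by hand with the RPCs already used in this trace ($RPC below is merely shorthand for the /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock invocation seen throughout; it is not defined by the test itself):
  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $RPC bdev_raid_remove_base_bdev spare            # degrade raid_bdev1 to 1 of 2 base bdevs
  $RPC bdev_raid_add_base_bdev raid_bdev1 spare    # older superblock seq_number, so the bdev is re-added and rebuilt
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"'   # "rebuild" while the resync runs, "none" once finished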
00:27:37.742 [2024-07-12 13:53:26.217091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:37.742 [2024-07-12 13:53:26.221934] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1753dd0 00:27:37.742 [2024-07-12 13:53:26.224288] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:37.742 13:53:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:38.678 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:38.678 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:38.678 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:38.678 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:38.678 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:38.678 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:38.678 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:38.936 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:38.936 "name": "raid_bdev1", 00:27:38.936 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:38.936 "strip_size_kb": 0, 00:27:38.936 "state": "online", 00:27:38.937 "raid_level": "raid1", 00:27:38.937 "superblock": true, 00:27:38.937 "num_base_bdevs": 2, 00:27:38.937 "num_base_bdevs_discovered": 2, 00:27:38.937 "num_base_bdevs_operational": 2, 00:27:38.937 "process": { 00:27:38.937 "type": "rebuild", 00:27:38.937 "target": "spare", 00:27:38.937 "progress": { 00:27:38.937 "blocks": 3072, 00:27:38.937 "percent": 38 00:27:38.937 } 00:27:38.937 }, 00:27:38.937 "base_bdevs_list": [ 00:27:38.937 { 00:27:38.937 "name": "spare", 00:27:38.937 "uuid": "ccb8df24-b3a6-5ff6-afa3-5f83bb9b642d", 00:27:38.937 "is_configured": true, 00:27:38.937 "data_offset": 256, 00:27:38.937 "data_size": 7936 00:27:38.937 }, 00:27:38.937 { 00:27:38.937 "name": "BaseBdev2", 00:27:38.937 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:38.937 "is_configured": true, 00:27:38.937 "data_offset": 256, 00:27:38.937 "data_size": 7936 00:27:38.937 } 00:27:38.937 ] 00:27:38.937 }' 00:27:38.937 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:39.196 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:39.196 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:39.196 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:39.196 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:39.455 [2024-07-12 13:53:27.818610] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:39.455 [2024-07-12 13:53:27.836865] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:39.455 [2024-07-12 13:53:27.836910] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:39.455 [2024-07-12 13:53:27.836934] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:39.455 [2024-07-12 13:53:27.836943] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:39.455 13:53:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.714 13:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:39.714 "name": "raid_bdev1", 00:27:39.714 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:39.714 "strip_size_kb": 0, 00:27:39.714 "state": "online", 00:27:39.714 "raid_level": "raid1", 00:27:39.714 "superblock": true, 00:27:39.714 "num_base_bdevs": 2, 00:27:39.714 "num_base_bdevs_discovered": 1, 00:27:39.714 "num_base_bdevs_operational": 1, 00:27:39.714 "base_bdevs_list": [ 00:27:39.714 { 00:27:39.714 "name": null, 00:27:39.714 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:39.714 "is_configured": false, 00:27:39.714 "data_offset": 256, 00:27:39.714 "data_size": 7936 00:27:39.714 }, 00:27:39.714 { 00:27:39.714 "name": "BaseBdev2", 00:27:39.714 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:39.714 "is_configured": true, 00:27:39.714 "data_offset": 256, 00:27:39.714 "data_size": 7936 00:27:39.714 } 00:27:39.714 ] 00:27:39.714 }' 00:27:39.714 13:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:39.714 13:53:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:40.280 13:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:40.539 [2024-07-12 13:53:28.928098] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:40.539 [2024-07-12 13:53:28.928147] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:40.539 [2024-07-12 13:53:28.928173] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1757930 00:27:40.539 
[2024-07-12 13:53:28.928186] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:40.539 [2024-07-12 13:53:28.928555] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:40.539 [2024-07-12 13:53:28.928575] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:40.539 [2024-07-12 13:53:28.928652] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:40.539 [2024-07-12 13:53:28.928664] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:40.539 [2024-07-12 13:53:28.928675] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:40.539 [2024-07-12 13:53:28.928693] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:40.539 [2024-07-12 13:53:28.933559] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x175cd10 00:27:40.539 spare 00:27:40.539 [2024-07-12 13:53:28.935009] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:40.539 13:53:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:41.484 13:53:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:41.484 13:53:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:41.484 13:53:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:41.484 13:53:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:41.484 13:53:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:41.484 13:53:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:41.484 13:53:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.742 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:41.742 "name": "raid_bdev1", 00:27:41.742 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:41.742 "strip_size_kb": 0, 00:27:41.742 "state": "online", 00:27:41.742 "raid_level": "raid1", 00:27:41.742 "superblock": true, 00:27:41.742 "num_base_bdevs": 2, 00:27:41.742 "num_base_bdevs_discovered": 2, 00:27:41.742 "num_base_bdevs_operational": 2, 00:27:41.742 "process": { 00:27:41.742 "type": "rebuild", 00:27:41.742 "target": "spare", 00:27:41.742 "progress": { 00:27:41.742 "blocks": 3072, 00:27:41.742 "percent": 38 00:27:41.742 } 00:27:41.742 }, 00:27:41.742 "base_bdevs_list": [ 00:27:41.742 { 00:27:41.742 "name": "spare", 00:27:41.742 "uuid": "ccb8df24-b3a6-5ff6-afa3-5f83bb9b642d", 00:27:41.742 "is_configured": true, 00:27:41.742 "data_offset": 256, 00:27:41.742 "data_size": 7936 00:27:41.742 }, 00:27:41.742 { 00:27:41.742 "name": "BaseBdev2", 00:27:41.742 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:41.742 "is_configured": true, 00:27:41.742 "data_offset": 256, 00:27:41.742 "data_size": 7936 00:27:41.742 } 00:27:41.742 ] 00:27:41.742 }' 00:27:41.742 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:41.742 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:41.742 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:41.742 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:41.742 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:42.001 [2024-07-12 13:53:30.490134] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:42.001 [2024-07-12 13:53:30.547918] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:42.001 [2024-07-12 13:53:30.547973] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:42.001 [2024-07-12 13:53:30.547988] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:42.001 [2024-07-12 13:53:30.547996] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:42.001 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:42.001 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:42.001 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:42.001 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:42.001 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:42.001 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:42.001 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:42.001 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:42.001 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:42.001 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:42.260 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.260 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:42.260 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:42.260 "name": "raid_bdev1", 00:27:42.260 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:42.260 "strip_size_kb": 0, 00:27:42.260 "state": "online", 00:27:42.260 "raid_level": "raid1", 00:27:42.260 "superblock": true, 00:27:42.260 "num_base_bdevs": 2, 00:27:42.260 "num_base_bdevs_discovered": 1, 00:27:42.260 "num_base_bdevs_operational": 1, 00:27:42.260 "base_bdevs_list": [ 00:27:42.260 { 00:27:42.260 "name": null, 00:27:42.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:42.260 "is_configured": false, 00:27:42.260 "data_offset": 256, 00:27:42.260 "data_size": 7936 00:27:42.260 }, 00:27:42.260 { 00:27:42.260 "name": "BaseBdev2", 00:27:42.260 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:42.260 "is_configured": true, 00:27:42.260 "data_offset": 256, 00:27:42.260 "data_size": 7936 00:27:42.260 } 00:27:42.260 ] 00:27:42.260 }' 
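The raid_bdev_info captured above is the degraded view checked by verify_raid_bdev_state raid_bdev1 online raid1 0 1: the array stays online at raid1 with num_base_bdevs_discovered down to 1 and an unconfigured null slot where 'spare' used to be. The exact comparisons inside verify_raid_bdev_state are not shown in this trace, so the jq spot-checks below are only an illustrative sketch over that JSON (field names taken from the output above):
  echo "$raid_bdev_info" | jq -r '.state'                                                  # expect "online"
  echo "$raid_bdev_info" | jq -r '.raid_level, .strip_size_kb'                             # expect "raid1" and 0
  echo "$raid_bdev_info" | jq -r '.num_base_bdevs_discovered'                              # expect 1 after the spare was dropped
  echo "$raid_bdev_info" | jq -r '[.base_bdevs_list[] | select(.is_configured)] | length'  # only BaseBdev2 is still configured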
00:27:42.260 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:42.260 13:53:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:43.193 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:43.193 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:43.193 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:43.193 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:43.193 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:43.193 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.193 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.194 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:43.194 "name": "raid_bdev1", 00:27:43.194 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:43.194 "strip_size_kb": 0, 00:27:43.194 "state": "online", 00:27:43.194 "raid_level": "raid1", 00:27:43.194 "superblock": true, 00:27:43.194 "num_base_bdevs": 2, 00:27:43.194 "num_base_bdevs_discovered": 1, 00:27:43.194 "num_base_bdevs_operational": 1, 00:27:43.194 "base_bdevs_list": [ 00:27:43.194 { 00:27:43.194 "name": null, 00:27:43.194 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.194 "is_configured": false, 00:27:43.194 "data_offset": 256, 00:27:43.194 "data_size": 7936 00:27:43.194 }, 00:27:43.194 { 00:27:43.194 "name": "BaseBdev2", 00:27:43.194 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:43.194 "is_configured": true, 00:27:43.194 "data_offset": 256, 00:27:43.194 "data_size": 7936 00:27:43.194 } 00:27:43.194 ] 00:27:43.194 }' 00:27:43.194 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:43.194 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:43.194 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:43.452 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:43.452 13:53:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:43.710 13:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:43.710 [2024-07-12 13:53:32.253728] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:43.710 [2024-07-12 13:53:32.253778] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:43.710 [2024-07-12 13:53:32.253801] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x175bf50 00:27:43.710 [2024-07-12 13:53:32.253815] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:43.710 [2024-07-12 13:53:32.254168] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:27:43.710 [2024-07-12 13:53:32.254188] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:43.710 [2024-07-12 13:53:32.254253] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:43.710 [2024-07-12 13:53:32.254266] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:43.710 [2024-07-12 13:53:32.254285] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:43.710 BaseBdev1 00:27:43.710 13:53:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:45.084 "name": "raid_bdev1", 00:27:45.084 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:45.084 "strip_size_kb": 0, 00:27:45.084 "state": "online", 00:27:45.084 "raid_level": "raid1", 00:27:45.084 "superblock": true, 00:27:45.084 "num_base_bdevs": 2, 00:27:45.084 "num_base_bdevs_discovered": 1, 00:27:45.084 "num_base_bdevs_operational": 1, 00:27:45.084 "base_bdevs_list": [ 00:27:45.084 { 00:27:45.084 "name": null, 00:27:45.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:45.084 "is_configured": false, 00:27:45.084 "data_offset": 256, 00:27:45.084 "data_size": 7936 00:27:45.084 }, 00:27:45.084 { 00:27:45.084 "name": "BaseBdev2", 00:27:45.084 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:45.084 "is_configured": true, 00:27:45.084 "data_offset": 256, 00:27:45.084 "data_size": 7936 00:27:45.084 } 00:27:45.084 ] 00:27:45.084 }' 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:45.084 13:53:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:45.650 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:45.650 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:45.650 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:45.650 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:45.650 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:45.650 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.650 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:45.909 "name": "raid_bdev1", 00:27:45.909 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:45.909 "strip_size_kb": 0, 00:27:45.909 "state": "online", 00:27:45.909 "raid_level": "raid1", 00:27:45.909 "superblock": true, 00:27:45.909 "num_base_bdevs": 2, 00:27:45.909 "num_base_bdevs_discovered": 1, 00:27:45.909 "num_base_bdevs_operational": 1, 00:27:45.909 "base_bdevs_list": [ 00:27:45.909 { 00:27:45.909 "name": null, 00:27:45.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:45.909 "is_configured": false, 00:27:45.909 "data_offset": 256, 00:27:45.909 "data_size": 7936 00:27:45.909 }, 00:27:45.909 { 00:27:45.909 "name": "BaseBdev2", 00:27:45.909 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:45.909 "is_configured": true, 00:27:45.909 "data_offset": 256, 00:27:45.909 "data_size": 7936 00:27:45.909 } 00:27:45.909 ] 00:27:45.909 }' 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case 
"$(type -t "$arg")" in 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:45.909 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:46.168 [2024-07-12 13:53:34.692209] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:46.168 [2024-07-12 13:53:34.692336] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:46.168 [2024-07-12 13:53:34.692351] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:46.168 request: 00:27:46.168 { 00:27:46.168 "base_bdev": "BaseBdev1", 00:27:46.168 "raid_bdev": "raid_bdev1", 00:27:46.168 "method": "bdev_raid_add_base_bdev", 00:27:46.168 "req_id": 1 00:27:46.168 } 00:27:46.168 Got JSON-RPC error response 00:27:46.168 response: 00:27:46.168 { 00:27:46.168 "code": -22, 00:27:46.168 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:46.168 } 00:27:46.168 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:27:46.168 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:46.168 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:46.168 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:46.168 13:53:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:47.543 "name": "raid_bdev1", 
00:27:47.543 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:47.543 "strip_size_kb": 0, 00:27:47.543 "state": "online", 00:27:47.543 "raid_level": "raid1", 00:27:47.543 "superblock": true, 00:27:47.543 "num_base_bdevs": 2, 00:27:47.543 "num_base_bdevs_discovered": 1, 00:27:47.543 "num_base_bdevs_operational": 1, 00:27:47.543 "base_bdevs_list": [ 00:27:47.543 { 00:27:47.543 "name": null, 00:27:47.543 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:47.543 "is_configured": false, 00:27:47.543 "data_offset": 256, 00:27:47.543 "data_size": 7936 00:27:47.543 }, 00:27:47.543 { 00:27:47.543 "name": "BaseBdev2", 00:27:47.543 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:47.543 "is_configured": true, 00:27:47.543 "data_offset": 256, 00:27:47.543 "data_size": 7936 00:27:47.543 } 00:27:47.543 ] 00:27:47.543 }' 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:47.543 13:53:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:48.110 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:48.110 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:48.110 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:48.110 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:48.110 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:48.110 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:48.110 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.369 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:48.369 "name": "raid_bdev1", 00:27:48.369 "uuid": "ce990824-29e4-4659-9eed-f5f89d83c599", 00:27:48.369 "strip_size_kb": 0, 00:27:48.369 "state": "online", 00:27:48.369 "raid_level": "raid1", 00:27:48.369 "superblock": true, 00:27:48.369 "num_base_bdevs": 2, 00:27:48.369 "num_base_bdevs_discovered": 1, 00:27:48.369 "num_base_bdevs_operational": 1, 00:27:48.369 "base_bdevs_list": [ 00:27:48.369 { 00:27:48.369 "name": null, 00:27:48.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:48.369 "is_configured": false, 00:27:48.369 "data_offset": 256, 00:27:48.369 "data_size": 7936 00:27:48.369 }, 00:27:48.369 { 00:27:48.369 "name": "BaseBdev2", 00:27:48.369 "uuid": "112543ec-58b8-5163-a10a-c604f5cbb29d", 00:27:48.369 "is_configured": true, 00:27:48.369 "data_offset": 256, 00:27:48.369 "data_size": 7936 00:27:48.369 } 00:27:48.369 ] 00:27:48.369 }' 00:27:48.369 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:48.369 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:48.369 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:48.629 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:48.629 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 575493 00:27:48.629 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- 
# '[' -z 575493 ']'
00:27:48.629 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 575493
00:27:48.629 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname
00:27:48.629 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:27:48.629 13:53:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 575493
00:27:48.629 13:53:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:27:48.629 13:53:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:27:48.629 13:53:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 575493'
00:27:48.629 killing process with pid 575493
00:27:48.629 13:53:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 575493
00:27:48.629 Received shutdown signal, test time was about 60.000000 seconds
00:27:48.629
00:27:48.629 Latency(us)
00:27:48.629 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:27:48.629 ===================================================================================================================
00:27:48.629 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00
00:27:48.629 [2024-07-12 13:53:37.040365] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:27:48.629 [2024-07-12 13:53:37.040455] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:27:48.629 [2024-07-12 13:53:37.040499] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:27:48.629 [2024-07-12 13:53:37.040511] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17575b0 name raid_bdev1, state offline
00:27:48.629 13:53:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 575493
00:27:48.629 [2024-07-12 13:53:37.071952] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:27:48.889 13:53:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0
00:27:48.889
00:27:48.889 real 0m31.585s
00:27:48.889 user 0m50.210s
00:27:48.889 sys 0m5.275s
00:27:48.889 13:53:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable
00:27:48.889 13:53:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x
00:27:48.889 ************************************
00:27:48.889 END TEST raid_rebuild_test_sb_4k
00:27:48.889 ************************************
00:27:48.889 13:53:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:27:48.889 13:53:37 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32'
00:27:48.889 13:53:37 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true
00:27:48.889 13:53:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:27:48.889 13:53:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:27:48.889 13:53:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:27:48.889 ************************************
00:27:48.889 START TEST raid_state_function_test_sb_md_separate
00:27:48.889 ************************************
00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true
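With raid_rebuild_test_sb_4k finished and its target app torn down, the suite starts raid_state_function_test_sb_md_separate, which reruns the raid1 state-machine checks on malloc base bdevs created with separate metadata (base_malloc_params='-m 32'). A minimal sketch of the setup the following trace walks through, using only RPCs that appear in it ($RPC again being shorthand for scripts/rpc.py -s /var/tmp/spdk-raid.sock, not something the test defines):
  $RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid              # -s: with superblock; stays "configuring" until base bdevs appear
  $RPC bdev_malloc_create 32 4096 -m 32 -b BaseBdev1                                      # 32 MiB of 4096-byte blocks with 32 bytes of separate metadata
  $RPC bdev_get_bdevs -b BaseBdev1 | jq -r '.[0].md_size, .[0].md_interleave'             # expect 32 and false
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # still "configuring" with one of two base bdevs discovered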
00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=579970 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 579970' 00:27:48.889 Process raid pid: 579970 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 579970 /var/tmp/spdk-raid.sock 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 579970 ']' 00:27:48.889 13:53:37 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:48.889 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:48.889 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:48.889 [2024-07-12 13:53:37.448650] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:27:48.889 [2024-07-12 13:53:37.448716] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:49.149 [2024-07-12 13:53:37.577124] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:49.149 [2024-07-12 13:53:37.687272] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:49.408 [2024-07-12 13:53:37.753983] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:49.408 [2024-07-12 13:53:37.754012] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:49.408 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:49.408 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:49.408 13:53:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:49.693 [2024-07-12 13:53:38.141484] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:49.693 [2024-07-12 13:53:38.141526] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:49.693 [2024-07-12 13:53:38.141538] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:49.693 [2024-07-12 13:53:38.141549] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:49.693 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:49.693 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:49.693 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:49.693 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:49.693 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:49.693 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:49.693 13:53:38 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:49.693 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:49.693 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:49.693 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:49.693 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.693 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:49.980 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:49.980 "name": "Existed_Raid", 00:27:49.980 "uuid": "0d74f1a3-4ec2-4544-9635-4af3652672f9", 00:27:49.980 "strip_size_kb": 0, 00:27:49.980 "state": "configuring", 00:27:49.980 "raid_level": "raid1", 00:27:49.980 "superblock": true, 00:27:49.980 "num_base_bdevs": 2, 00:27:49.980 "num_base_bdevs_discovered": 0, 00:27:49.980 "num_base_bdevs_operational": 2, 00:27:49.980 "base_bdevs_list": [ 00:27:49.980 { 00:27:49.980 "name": "BaseBdev1", 00:27:49.980 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:49.980 "is_configured": false, 00:27:49.980 "data_offset": 0, 00:27:49.980 "data_size": 0 00:27:49.980 }, 00:27:49.980 { 00:27:49.980 "name": "BaseBdev2", 00:27:49.980 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:49.980 "is_configured": false, 00:27:49.980 "data_offset": 0, 00:27:49.980 "data_size": 0 00:27:49.980 } 00:27:49.980 ] 00:27:49.980 }' 00:27:49.980 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:49.980 13:53:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:50.548 13:53:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:50.807 [2024-07-12 13:53:39.284373] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:50.807 [2024-07-12 13:53:39.284405] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1745330 name Existed_Raid, state configuring 00:27:50.807 13:53:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:51.066 [2024-07-12 13:53:39.533053] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:51.066 [2024-07-12 13:53:39.533083] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:51.066 [2024-07-12 13:53:39.533093] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:51.066 [2024-07-12 13:53:39.533104] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:51.066 13:53:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 
-m 32 -b BaseBdev1 00:27:51.325 [2024-07-12 13:53:39.788045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:51.325 BaseBdev1 00:27:51.325 13:53:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:51.325 13:53:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:27:51.325 13:53:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:51.325 13:53:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:27:51.325 13:53:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:51.325 13:53:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:51.325 13:53:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:51.585 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:51.843 [ 00:27:51.843 { 00:27:51.843 "name": "BaseBdev1", 00:27:51.843 "aliases": [ 00:27:51.843 "421b8485-938f-4098-a434-60f9c693759f" 00:27:51.843 ], 00:27:51.843 "product_name": "Malloc disk", 00:27:51.843 "block_size": 4096, 00:27:51.843 "num_blocks": 8192, 00:27:51.843 "uuid": "421b8485-938f-4098-a434-60f9c693759f", 00:27:51.843 "md_size": 32, 00:27:51.843 "md_interleave": false, 00:27:51.843 "dif_type": 0, 00:27:51.843 "assigned_rate_limits": { 00:27:51.843 "rw_ios_per_sec": 0, 00:27:51.843 "rw_mbytes_per_sec": 0, 00:27:51.843 "r_mbytes_per_sec": 0, 00:27:51.843 "w_mbytes_per_sec": 0 00:27:51.843 }, 00:27:51.843 "claimed": true, 00:27:51.843 "claim_type": "exclusive_write", 00:27:51.843 "zoned": false, 00:27:51.843 "supported_io_types": { 00:27:51.843 "read": true, 00:27:51.843 "write": true, 00:27:51.843 "unmap": true, 00:27:51.843 "flush": true, 00:27:51.843 "reset": true, 00:27:51.843 "nvme_admin": false, 00:27:51.843 "nvme_io": false, 00:27:51.843 "nvme_io_md": false, 00:27:51.843 "write_zeroes": true, 00:27:51.843 "zcopy": true, 00:27:51.843 "get_zone_info": false, 00:27:51.843 "zone_management": false, 00:27:51.843 "zone_append": false, 00:27:51.843 "compare": false, 00:27:51.843 "compare_and_write": false, 00:27:51.843 "abort": true, 00:27:51.843 "seek_hole": false, 00:27:51.843 "seek_data": false, 00:27:51.843 "copy": true, 00:27:51.843 "nvme_iov_md": false 00:27:51.843 }, 00:27:51.843 "memory_domains": [ 00:27:51.843 { 00:27:51.843 "dma_device_id": "system", 00:27:51.843 "dma_device_type": 1 00:27:51.843 }, 00:27:51.843 { 00:27:51.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:51.843 "dma_device_type": 2 00:27:51.843 } 00:27:51.843 ], 00:27:51.843 "driver_specific": {} 00:27:51.843 } 00:27:51.843 ] 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.843 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:52.102 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:52.102 "name": "Existed_Raid", 00:27:52.102 "uuid": "83829f38-dbb8-4dd2-a2de-3f2c9ceed5cf", 00:27:52.102 "strip_size_kb": 0, 00:27:52.102 "state": "configuring", 00:27:52.102 "raid_level": "raid1", 00:27:52.102 "superblock": true, 00:27:52.102 "num_base_bdevs": 2, 00:27:52.102 "num_base_bdevs_discovered": 1, 00:27:52.102 "num_base_bdevs_operational": 2, 00:27:52.102 "base_bdevs_list": [ 00:27:52.102 { 00:27:52.102 "name": "BaseBdev1", 00:27:52.102 "uuid": "421b8485-938f-4098-a434-60f9c693759f", 00:27:52.102 "is_configured": true, 00:27:52.102 "data_offset": 256, 00:27:52.102 "data_size": 7936 00:27:52.102 }, 00:27:52.102 { 00:27:52.102 "name": "BaseBdev2", 00:27:52.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:52.102 "is_configured": false, 00:27:52.102 "data_offset": 0, 00:27:52.102 "data_size": 0 00:27:52.102 } 00:27:52.102 ] 00:27:52.102 }' 00:27:52.102 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:52.102 13:53:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:52.670 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:52.929 [2024-07-12 13:53:41.376307] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:52.929 [2024-07-12 13:53:41.376346] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1744c20 name Existed_Raid, state configuring 00:27:52.929 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:53.188 [2024-07-12 13:53:41.625010] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:53.188 [2024-07-12 13:53:41.626433] bdev.c:8157:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:53.188 [2024-07-12 13:53:41.626465] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:53.188 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:53.188 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:53.188 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:53.188 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:53.188 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:53.188 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:53.188 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:53.188 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:53.188 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:53.188 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:53.189 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:53.189 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:53.189 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:53.189 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:53.447 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:53.447 "name": "Existed_Raid", 00:27:53.447 "uuid": "cb220900-7555-45c7-b50d-bc02a1734665", 00:27:53.447 "strip_size_kb": 0, 00:27:53.447 "state": "configuring", 00:27:53.447 "raid_level": "raid1", 00:27:53.447 "superblock": true, 00:27:53.447 "num_base_bdevs": 2, 00:27:53.447 "num_base_bdevs_discovered": 1, 00:27:53.447 "num_base_bdevs_operational": 2, 00:27:53.447 "base_bdevs_list": [ 00:27:53.447 { 00:27:53.447 "name": "BaseBdev1", 00:27:53.447 "uuid": "421b8485-938f-4098-a434-60f9c693759f", 00:27:53.447 "is_configured": true, 00:27:53.447 "data_offset": 256, 00:27:53.447 "data_size": 7936 00:27:53.447 }, 00:27:53.447 { 00:27:53.447 "name": "BaseBdev2", 00:27:53.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:53.447 "is_configured": false, 00:27:53.447 "data_offset": 0, 00:27:53.447 "data_size": 0 00:27:53.447 } 00:27:53.447 ] 00:27:53.447 }' 00:27:53.447 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:53.447 13:53:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:54.015 13:53:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 
32 -b BaseBdev2 00:27:54.274 [2024-07-12 13:53:42.663919] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:54.274 [2024-07-12 13:53:42.664067] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1746b50 00:27:54.274 [2024-07-12 13:53:42.664080] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:54.274 [2024-07-12 13:53:42.664141] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17451c0 00:27:54.274 [2024-07-12 13:53:42.664240] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1746b50 00:27:54.274 [2024-07-12 13:53:42.664250] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1746b50 00:27:54.274 [2024-07-12 13:53:42.664317] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:54.274 BaseBdev2 00:27:54.274 13:53:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:54.274 13:53:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:54.274 13:53:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:54.274 13:53:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:27:54.274 13:53:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:54.274 13:53:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:54.274 13:53:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:54.534 13:53:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:54.534 [ 00:27:54.534 { 00:27:54.534 "name": "BaseBdev2", 00:27:54.534 "aliases": [ 00:27:54.534 "c06746d0-8ed7-4002-a102-ed3807ce1e8c" 00:27:54.534 ], 00:27:54.534 "product_name": "Malloc disk", 00:27:54.534 "block_size": 4096, 00:27:54.534 "num_blocks": 8192, 00:27:54.534 "uuid": "c06746d0-8ed7-4002-a102-ed3807ce1e8c", 00:27:54.534 "md_size": 32, 00:27:54.534 "md_interleave": false, 00:27:54.534 "dif_type": 0, 00:27:54.534 "assigned_rate_limits": { 00:27:54.534 "rw_ios_per_sec": 0, 00:27:54.534 "rw_mbytes_per_sec": 0, 00:27:54.534 "r_mbytes_per_sec": 0, 00:27:54.534 "w_mbytes_per_sec": 0 00:27:54.534 }, 00:27:54.534 "claimed": true, 00:27:54.534 "claim_type": "exclusive_write", 00:27:54.534 "zoned": false, 00:27:54.534 "supported_io_types": { 00:27:54.534 "read": true, 00:27:54.534 "write": true, 00:27:54.534 "unmap": true, 00:27:54.534 "flush": true, 00:27:54.534 "reset": true, 00:27:54.534 "nvme_admin": false, 00:27:54.534 "nvme_io": false, 00:27:54.534 "nvme_io_md": false, 00:27:54.534 "write_zeroes": true, 00:27:54.534 "zcopy": true, 00:27:54.534 "get_zone_info": false, 00:27:54.534 "zone_management": false, 00:27:54.534 "zone_append": false, 00:27:54.534 "compare": false, 00:27:54.534 "compare_and_write": false, 00:27:54.534 "abort": true, 00:27:54.534 "seek_hole": false, 00:27:54.534 "seek_data": false, 00:27:54.534 "copy": true, 00:27:54.534 "nvme_iov_md": false 00:27:54.534 }, 
00:27:54.534 "memory_domains": [ 00:27:54.534 { 00:27:54.534 "dma_device_id": "system", 00:27:54.534 "dma_device_type": 1 00:27:54.534 }, 00:27:54.534 { 00:27:54.534 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:54.534 "dma_device_type": 2 00:27:54.534 } 00:27:54.534 ], 00:27:54.534 "driver_specific": {} 00:27:54.534 } 00:27:54.534 ] 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:54.793 "name": "Existed_Raid", 00:27:54.793 "uuid": "cb220900-7555-45c7-b50d-bc02a1734665", 00:27:54.793 "strip_size_kb": 0, 00:27:54.793 "state": "online", 00:27:54.793 "raid_level": "raid1", 00:27:54.793 "superblock": true, 00:27:54.793 "num_base_bdevs": 2, 00:27:54.793 "num_base_bdevs_discovered": 2, 00:27:54.793 "num_base_bdevs_operational": 2, 00:27:54.793 "base_bdevs_list": [ 00:27:54.793 { 00:27:54.793 "name": "BaseBdev1", 00:27:54.793 "uuid": "421b8485-938f-4098-a434-60f9c693759f", 00:27:54.793 "is_configured": true, 00:27:54.793 "data_offset": 256, 00:27:54.793 "data_size": 7936 00:27:54.793 }, 00:27:54.793 { 00:27:54.793 "name": "BaseBdev2", 00:27:54.793 "uuid": "c06746d0-8ed7-4002-a102-ed3807ce1e8c", 00:27:54.793 "is_configured": true, 00:27:54.793 "data_offset": 256, 00:27:54.793 "data_size": 7936 00:27:54.793 } 00:27:54.793 ] 00:27:54.793 }' 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.793 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- 
common/autotest_common.sh@10 -- # set +x 00:27:55.359 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:55.359 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:55.359 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:55.359 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:55.359 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:55.359 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:55.359 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:55.359 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:55.618 [2024-07-12 13:53:43.967683] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:55.618 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:55.618 "name": "Existed_Raid", 00:27:55.618 "aliases": [ 00:27:55.618 "cb220900-7555-45c7-b50d-bc02a1734665" 00:27:55.618 ], 00:27:55.618 "product_name": "Raid Volume", 00:27:55.618 "block_size": 4096, 00:27:55.618 "num_blocks": 7936, 00:27:55.618 "uuid": "cb220900-7555-45c7-b50d-bc02a1734665", 00:27:55.618 "md_size": 32, 00:27:55.618 "md_interleave": false, 00:27:55.618 "dif_type": 0, 00:27:55.618 "assigned_rate_limits": { 00:27:55.618 "rw_ios_per_sec": 0, 00:27:55.618 "rw_mbytes_per_sec": 0, 00:27:55.618 "r_mbytes_per_sec": 0, 00:27:55.618 "w_mbytes_per_sec": 0 00:27:55.618 }, 00:27:55.618 "claimed": false, 00:27:55.618 "zoned": false, 00:27:55.618 "supported_io_types": { 00:27:55.618 "read": true, 00:27:55.618 "write": true, 00:27:55.618 "unmap": false, 00:27:55.618 "flush": false, 00:27:55.618 "reset": true, 00:27:55.618 "nvme_admin": false, 00:27:55.618 "nvme_io": false, 00:27:55.618 "nvme_io_md": false, 00:27:55.618 "write_zeroes": true, 00:27:55.618 "zcopy": false, 00:27:55.618 "get_zone_info": false, 00:27:55.618 "zone_management": false, 00:27:55.618 "zone_append": false, 00:27:55.618 "compare": false, 00:27:55.618 "compare_and_write": false, 00:27:55.618 "abort": false, 00:27:55.618 "seek_hole": false, 00:27:55.618 "seek_data": false, 00:27:55.618 "copy": false, 00:27:55.618 "nvme_iov_md": false 00:27:55.618 }, 00:27:55.618 "memory_domains": [ 00:27:55.618 { 00:27:55.618 "dma_device_id": "system", 00:27:55.618 "dma_device_type": 1 00:27:55.618 }, 00:27:55.618 { 00:27:55.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:55.618 "dma_device_type": 2 00:27:55.618 }, 00:27:55.618 { 00:27:55.618 "dma_device_id": "system", 00:27:55.618 "dma_device_type": 1 00:27:55.618 }, 00:27:55.618 { 00:27:55.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:55.618 "dma_device_type": 2 00:27:55.618 } 00:27:55.618 ], 00:27:55.618 "driver_specific": { 00:27:55.618 "raid": { 00:27:55.618 "uuid": "cb220900-7555-45c7-b50d-bc02a1734665", 00:27:55.618 "strip_size_kb": 0, 00:27:55.618 "state": "online", 00:27:55.618 "raid_level": "raid1", 00:27:55.618 "superblock": true, 00:27:55.618 "num_base_bdevs": 2, 00:27:55.618 "num_base_bdevs_discovered": 
2, 00:27:55.618 "num_base_bdevs_operational": 2, 00:27:55.618 "base_bdevs_list": [ 00:27:55.618 { 00:27:55.618 "name": "BaseBdev1", 00:27:55.618 "uuid": "421b8485-938f-4098-a434-60f9c693759f", 00:27:55.618 "is_configured": true, 00:27:55.618 "data_offset": 256, 00:27:55.618 "data_size": 7936 00:27:55.618 }, 00:27:55.618 { 00:27:55.618 "name": "BaseBdev2", 00:27:55.618 "uuid": "c06746d0-8ed7-4002-a102-ed3807ce1e8c", 00:27:55.618 "is_configured": true, 00:27:55.618 "data_offset": 256, 00:27:55.618 "data_size": 7936 00:27:55.618 } 00:27:55.618 ] 00:27:55.618 } 00:27:55.618 } 00:27:55.618 }' 00:27:55.618 13:53:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:55.618 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:55.618 BaseBdev2' 00:27:55.618 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:55.618 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:55.618 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:55.876 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:55.876 "name": "BaseBdev1", 00:27:55.876 "aliases": [ 00:27:55.876 "421b8485-938f-4098-a434-60f9c693759f" 00:27:55.876 ], 00:27:55.876 "product_name": "Malloc disk", 00:27:55.876 "block_size": 4096, 00:27:55.876 "num_blocks": 8192, 00:27:55.876 "uuid": "421b8485-938f-4098-a434-60f9c693759f", 00:27:55.876 "md_size": 32, 00:27:55.876 "md_interleave": false, 00:27:55.876 "dif_type": 0, 00:27:55.876 "assigned_rate_limits": { 00:27:55.876 "rw_ios_per_sec": 0, 00:27:55.876 "rw_mbytes_per_sec": 0, 00:27:55.876 "r_mbytes_per_sec": 0, 00:27:55.876 "w_mbytes_per_sec": 0 00:27:55.876 }, 00:27:55.876 "claimed": true, 00:27:55.876 "claim_type": "exclusive_write", 00:27:55.876 "zoned": false, 00:27:55.876 "supported_io_types": { 00:27:55.876 "read": true, 00:27:55.877 "write": true, 00:27:55.877 "unmap": true, 00:27:55.877 "flush": true, 00:27:55.877 "reset": true, 00:27:55.877 "nvme_admin": false, 00:27:55.877 "nvme_io": false, 00:27:55.877 "nvme_io_md": false, 00:27:55.877 "write_zeroes": true, 00:27:55.877 "zcopy": true, 00:27:55.877 "get_zone_info": false, 00:27:55.877 "zone_management": false, 00:27:55.877 "zone_append": false, 00:27:55.877 "compare": false, 00:27:55.877 "compare_and_write": false, 00:27:55.877 "abort": true, 00:27:55.877 "seek_hole": false, 00:27:55.877 "seek_data": false, 00:27:55.877 "copy": true, 00:27:55.877 "nvme_iov_md": false 00:27:55.877 }, 00:27:55.877 "memory_domains": [ 00:27:55.877 { 00:27:55.877 "dma_device_id": "system", 00:27:55.877 "dma_device_type": 1 00:27:55.877 }, 00:27:55.877 { 00:27:55.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:55.877 "dma_device_type": 2 00:27:55.877 } 00:27:55.877 ], 00:27:55.877 "driver_specific": {} 00:27:55.877 }' 00:27:55.877 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:55.877 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:55.877 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:55.877 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:56.135 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:56.135 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:56.135 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:56.135 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:56.135 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:56.135 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:56.393 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:56.393 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:56.393 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:56.393 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:56.393 13:53:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:56.959 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:56.959 "name": "BaseBdev2", 00:27:56.959 "aliases": [ 00:27:56.959 "c06746d0-8ed7-4002-a102-ed3807ce1e8c" 00:27:56.959 ], 00:27:56.959 "product_name": "Malloc disk", 00:27:56.959 "block_size": 4096, 00:27:56.959 "num_blocks": 8192, 00:27:56.959 "uuid": "c06746d0-8ed7-4002-a102-ed3807ce1e8c", 00:27:56.959 "md_size": 32, 00:27:56.959 "md_interleave": false, 00:27:56.959 "dif_type": 0, 00:27:56.959 "assigned_rate_limits": { 00:27:56.959 "rw_ios_per_sec": 0, 00:27:56.959 "rw_mbytes_per_sec": 0, 00:27:56.959 "r_mbytes_per_sec": 0, 00:27:56.959 "w_mbytes_per_sec": 0 00:27:56.959 }, 00:27:56.959 "claimed": true, 00:27:56.959 "claim_type": "exclusive_write", 00:27:56.959 "zoned": false, 00:27:56.959 "supported_io_types": { 00:27:56.959 "read": true, 00:27:56.959 "write": true, 00:27:56.959 "unmap": true, 00:27:56.959 "flush": true, 00:27:56.959 "reset": true, 00:27:56.959 "nvme_admin": false, 00:27:56.959 "nvme_io": false, 00:27:56.959 "nvme_io_md": false, 00:27:56.959 "write_zeroes": true, 00:27:56.959 "zcopy": true, 00:27:56.959 "get_zone_info": false, 00:27:56.959 "zone_management": false, 00:27:56.959 "zone_append": false, 00:27:56.959 "compare": false, 00:27:56.959 "compare_and_write": false, 00:27:56.959 "abort": true, 00:27:56.959 "seek_hole": false, 00:27:56.959 "seek_data": false, 00:27:56.959 "copy": true, 00:27:56.959 "nvme_iov_md": false 00:27:56.959 }, 00:27:56.959 "memory_domains": [ 00:27:56.959 { 00:27:56.959 "dma_device_id": "system", 00:27:56.959 "dma_device_type": 1 00:27:56.959 }, 00:27:56.959 { 00:27:56.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:56.959 "dma_device_type": 2 00:27:56.959 } 00:27:56.959 ], 00:27:56.959 "driver_specific": {} 00:27:56.959 }' 00:27:56.959 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:56.959 13:53:45 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:56.959 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:56.959 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:56.959 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:57.218 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:57.218 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:57.218 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:57.218 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:57.218 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:57.475 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:57.475 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:57.475 13:53:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:58.043 [2024-07-12 13:53:46.369834] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:58.043 13:53:46 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.043 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:58.611 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:58.611 "name": "Existed_Raid", 00:27:58.611 "uuid": "cb220900-7555-45c7-b50d-bc02a1734665", 00:27:58.611 "strip_size_kb": 0, 00:27:58.611 "state": "online", 00:27:58.611 "raid_level": "raid1", 00:27:58.611 "superblock": true, 00:27:58.611 "num_base_bdevs": 2, 00:27:58.611 "num_base_bdevs_discovered": 1, 00:27:58.611 "num_base_bdevs_operational": 1, 00:27:58.611 "base_bdevs_list": [ 00:27:58.611 { 00:27:58.611 "name": null, 00:27:58.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:58.611 "is_configured": false, 00:27:58.611 "data_offset": 256, 00:27:58.611 "data_size": 7936 00:27:58.611 }, 00:27:58.611 { 00:27:58.611 "name": "BaseBdev2", 00:27:58.611 "uuid": "c06746d0-8ed7-4002-a102-ed3807ce1e8c", 00:27:58.611 "is_configured": true, 00:27:58.611 "data_offset": 256, 00:27:58.611 "data_size": 7936 00:27:58.611 } 00:27:58.611 ] 00:27:58.611 }' 00:27:58.611 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:58.611 13:53:46 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:59.549 13:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:59.549 13:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:59.549 13:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.549 13:53:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:59.549 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:59.549 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:59.549 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:59.809 [2024-07-12 13:53:48.327383] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:59.809 [2024-07-12 13:53:48.327468] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:59.809 [2024-07-12 13:53:48.340838] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:59.809 [2024-07-12 13:53:48.340872] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:59.809 [2024-07-12 13:53:48.340884] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1746b50 name Existed_Raid, state offline 00:27:59.809 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:59.809 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:59.809 13:53:48 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.809 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:00.069 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:00.069 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:00.069 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:00.069 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 579970 00:28:00.069 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 579970 ']' 00:28:00.069 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 579970 00:28:00.069 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:00.069 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:00.069 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 579970 00:28:00.328 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:00.328 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:00.328 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 579970' 00:28:00.328 killing process with pid 579970 00:28:00.328 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 579970 00:28:00.328 [2024-07-12 13:53:48.661071] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:00.328 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 579970 00:28:00.328 [2024-07-12 13:53:48.662068] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:00.329 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:28:00.329 00:28:00.329 real 0m11.511s 00:28:00.329 user 0m21.084s 00:28:00.329 sys 0m2.082s 00:28:00.329 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:00.329 13:53:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:00.329 ************************************ 00:28:00.329 END TEST raid_state_function_test_sb_md_separate 00:28:00.329 ************************************ 00:28:00.588 13:53:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:00.588 13:53:48 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:28:00.588 13:53:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:00.588 13:53:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:00.588 13:53:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:00.588 ************************************ 00:28:00.588 START TEST raid_superblock_test_md_separate 00:28:00.588 
************************************ 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=581757 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 581757 /var/tmp/spdk-raid.sock 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 581757 ']' 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:00.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:00.588 13:53:48 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:00.588 [2024-07-12 13:53:49.031757] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
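The superblock test starts by launching its own bdev_svc application on a dedicated RPC socket with bdev_raid debug logging enabled, then waits for the RPC server to answer before issuing commands. A condensed sketch of that launch, assuming the SPDK repo root as working directory and the waitforlisten helper sourced from the autotest common scripts referenced in the log:

  # start a bare bdev service with raid debug traces, listening on a private RPC socket
  test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid &
  raid_pid=$!
  # block until the app answers RPCs on that socket (waitforlisten is the helper from autotest_common.sh)
  waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock

Every subsequent rpc.py call in this test targets that same -s /var/tmp/spdk-raid.sock socket.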
00:28:00.588 [2024-07-12 13:53:49.031823] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid581757 ] 00:28:00.588 [2024-07-12 13:53:49.160801] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:00.848 [2024-07-12 13:53:49.262970] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:00.848 [2024-07-12 13:53:49.318620] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:00.848 [2024-07-12 13:53:49.318647] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:01.416 13:53:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:01.416 13:53:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:01.416 13:53:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:01.416 13:53:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:01.416 13:53:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:01.416 13:53:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:01.416 13:53:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:01.416 13:53:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:01.416 13:53:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:01.416 13:53:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:01.416 13:53:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:28:01.675 malloc1 00:28:01.675 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:01.935 [2024-07-12 13:53:50.454584] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:01.935 [2024-07-12 13:53:50.454635] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:01.935 [2024-07-12 13:53:50.454658] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe86880 00:28:01.935 [2024-07-12 13:53:50.454671] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:01.935 [2024-07-12 13:53:50.456334] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:01.935 [2024-07-12 13:53:50.456379] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:01.935 pt1 00:28:01.935 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:01.935 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:01.935 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc2 00:28:01.935 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:01.935 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:01.935 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:01.935 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:01.935 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:01.935 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:28:02.194 malloc2 00:28:02.194 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:02.453 [2024-07-12 13:53:50.942906] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:02.453 [2024-07-12 13:53:50.942966] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:02.453 [2024-07-12 13:53:50.942986] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf98750 00:28:02.453 [2024-07-12 13:53:50.942999] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:02.453 [2024-07-12 13:53:50.944407] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:02.453 [2024-07-12 13:53:50.944435] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:02.453 pt2 00:28:02.453 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:02.453 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:02.453 13:53:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:02.712 [2024-07-12 13:53:51.183557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:02.712 [2024-07-12 13:53:51.184877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:02.712 [2024-07-12 13:53:51.185043] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf9ad60 00:28:02.712 [2024-07-12 13:53:51.185057] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:02.712 [2024-07-12 13:53:51.185129] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf989e0 00:28:02.712 [2024-07-12 13:53:51.185247] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf9ad60 00:28:02.712 [2024-07-12 13:53:51.185257] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf9ad60 00:28:02.712 [2024-07-12 13:53:51.185330] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:02.712 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:02.712 13:53:51 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:02.712 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:02.712 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:02.712 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:02.712 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:02.712 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:02.712 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:02.712 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:02.712 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:02.712 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.712 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.971 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:02.971 "name": "raid_bdev1", 00:28:02.971 "uuid": "f6669ef6-bdf2-42a5-9260-060569af4857", 00:28:02.971 "strip_size_kb": 0, 00:28:02.971 "state": "online", 00:28:02.971 "raid_level": "raid1", 00:28:02.971 "superblock": true, 00:28:02.971 "num_base_bdevs": 2, 00:28:02.971 "num_base_bdevs_discovered": 2, 00:28:02.971 "num_base_bdevs_operational": 2, 00:28:02.971 "base_bdevs_list": [ 00:28:02.971 { 00:28:02.971 "name": "pt1", 00:28:02.971 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:02.971 "is_configured": true, 00:28:02.971 "data_offset": 256, 00:28:02.971 "data_size": 7936 00:28:02.971 }, 00:28:02.971 { 00:28:02.971 "name": "pt2", 00:28:02.971 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:02.971 "is_configured": true, 00:28:02.971 "data_offset": 256, 00:28:02.971 "data_size": 7936 00:28:02.971 } 00:28:02.971 ] 00:28:02.971 }' 00:28:02.971 13:53:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:02.971 13:53:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:03.539 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:03.539 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:03.539 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:03.539 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:03.539 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:03.539 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:03.539 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:03.539 13:53:52 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:03.799 [2024-07-12 13:53:52.194474] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:03.799 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:03.799 "name": "raid_bdev1", 00:28:03.799 "aliases": [ 00:28:03.799 "f6669ef6-bdf2-42a5-9260-060569af4857" 00:28:03.799 ], 00:28:03.799 "product_name": "Raid Volume", 00:28:03.799 "block_size": 4096, 00:28:03.799 "num_blocks": 7936, 00:28:03.799 "uuid": "f6669ef6-bdf2-42a5-9260-060569af4857", 00:28:03.799 "md_size": 32, 00:28:03.799 "md_interleave": false, 00:28:03.799 "dif_type": 0, 00:28:03.799 "assigned_rate_limits": { 00:28:03.799 "rw_ios_per_sec": 0, 00:28:03.799 "rw_mbytes_per_sec": 0, 00:28:03.799 "r_mbytes_per_sec": 0, 00:28:03.799 "w_mbytes_per_sec": 0 00:28:03.799 }, 00:28:03.799 "claimed": false, 00:28:03.799 "zoned": false, 00:28:03.799 "supported_io_types": { 00:28:03.799 "read": true, 00:28:03.799 "write": true, 00:28:03.799 "unmap": false, 00:28:03.799 "flush": false, 00:28:03.799 "reset": true, 00:28:03.799 "nvme_admin": false, 00:28:03.799 "nvme_io": false, 00:28:03.799 "nvme_io_md": false, 00:28:03.799 "write_zeroes": true, 00:28:03.799 "zcopy": false, 00:28:03.799 "get_zone_info": false, 00:28:03.799 "zone_management": false, 00:28:03.799 "zone_append": false, 00:28:03.799 "compare": false, 00:28:03.799 "compare_and_write": false, 00:28:03.799 "abort": false, 00:28:03.799 "seek_hole": false, 00:28:03.799 "seek_data": false, 00:28:03.799 "copy": false, 00:28:03.799 "nvme_iov_md": false 00:28:03.799 }, 00:28:03.799 "memory_domains": [ 00:28:03.799 { 00:28:03.799 "dma_device_id": "system", 00:28:03.799 "dma_device_type": 1 00:28:03.799 }, 00:28:03.799 { 00:28:03.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:03.799 "dma_device_type": 2 00:28:03.799 }, 00:28:03.799 { 00:28:03.799 "dma_device_id": "system", 00:28:03.799 "dma_device_type": 1 00:28:03.799 }, 00:28:03.799 { 00:28:03.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:03.799 "dma_device_type": 2 00:28:03.799 } 00:28:03.799 ], 00:28:03.799 "driver_specific": { 00:28:03.799 "raid": { 00:28:03.799 "uuid": "f6669ef6-bdf2-42a5-9260-060569af4857", 00:28:03.799 "strip_size_kb": 0, 00:28:03.799 "state": "online", 00:28:03.799 "raid_level": "raid1", 00:28:03.799 "superblock": true, 00:28:03.799 "num_base_bdevs": 2, 00:28:03.799 "num_base_bdevs_discovered": 2, 00:28:03.799 "num_base_bdevs_operational": 2, 00:28:03.799 "base_bdevs_list": [ 00:28:03.799 { 00:28:03.799 "name": "pt1", 00:28:03.799 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:03.799 "is_configured": true, 00:28:03.799 "data_offset": 256, 00:28:03.799 "data_size": 7936 00:28:03.799 }, 00:28:03.799 { 00:28:03.799 "name": "pt2", 00:28:03.799 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:03.799 "is_configured": true, 00:28:03.799 "data_offset": 256, 00:28:03.799 "data_size": 7936 00:28:03.799 } 00:28:03.799 ] 00:28:03.799 } 00:28:03.799 } 00:28:03.799 }' 00:28:03.799 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:03.799 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:03.799 pt2' 00:28:03.799 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:03.799 13:53:52 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:03.799 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:04.058 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:04.058 "name": "pt1", 00:28:04.058 "aliases": [ 00:28:04.058 "00000000-0000-0000-0000-000000000001" 00:28:04.058 ], 00:28:04.058 "product_name": "passthru", 00:28:04.058 "block_size": 4096, 00:28:04.058 "num_blocks": 8192, 00:28:04.058 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:04.058 "md_size": 32, 00:28:04.058 "md_interleave": false, 00:28:04.058 "dif_type": 0, 00:28:04.058 "assigned_rate_limits": { 00:28:04.058 "rw_ios_per_sec": 0, 00:28:04.058 "rw_mbytes_per_sec": 0, 00:28:04.058 "r_mbytes_per_sec": 0, 00:28:04.058 "w_mbytes_per_sec": 0 00:28:04.058 }, 00:28:04.058 "claimed": true, 00:28:04.058 "claim_type": "exclusive_write", 00:28:04.058 "zoned": false, 00:28:04.058 "supported_io_types": { 00:28:04.058 "read": true, 00:28:04.058 "write": true, 00:28:04.058 "unmap": true, 00:28:04.058 "flush": true, 00:28:04.058 "reset": true, 00:28:04.058 "nvme_admin": false, 00:28:04.058 "nvme_io": false, 00:28:04.058 "nvme_io_md": false, 00:28:04.058 "write_zeroes": true, 00:28:04.058 "zcopy": true, 00:28:04.058 "get_zone_info": false, 00:28:04.058 "zone_management": false, 00:28:04.058 "zone_append": false, 00:28:04.058 "compare": false, 00:28:04.058 "compare_and_write": false, 00:28:04.058 "abort": true, 00:28:04.058 "seek_hole": false, 00:28:04.058 "seek_data": false, 00:28:04.058 "copy": true, 00:28:04.058 "nvme_iov_md": false 00:28:04.058 }, 00:28:04.058 "memory_domains": [ 00:28:04.058 { 00:28:04.058 "dma_device_id": "system", 00:28:04.058 "dma_device_type": 1 00:28:04.058 }, 00:28:04.058 { 00:28:04.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:04.058 "dma_device_type": 2 00:28:04.058 } 00:28:04.058 ], 00:28:04.058 "driver_specific": { 00:28:04.058 "passthru": { 00:28:04.058 "name": "pt1", 00:28:04.058 "base_bdev_name": "malloc1" 00:28:04.058 } 00:28:04.058 } 00:28:04.058 }' 00:28:04.058 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:04.058 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:04.058 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:04.058 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.317 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.317 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:04.317 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:04.317 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:04.317 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:04.317 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:04.317 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:04.317 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 
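Each base bdev is verified the same way: its description is fetched once with bdev_get_bdevs and the md-separate properties are pulled out with jq and compared against the expected values. A condensed sketch of the checks just applied to pt1 (same socket, rpc.py path shortened; the expected values are the ones echoed in the log):

  # fetch the passthru bdev's description once, then check the md-separate properties with jq
  info=$(scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 | jq '.[]')
  jq .block_size    <<< "$info"   # expect 4096
  jq .md_size       <<< "$info"   # expect 32 (metadata kept in a separate buffer)
  jq .md_interleave <<< "$info"   # expect false
  jq .dif_type      <<< "$info"   # expect 0

The raid volume itself reports 7936 blocks rather than the base bdevs' 8192, consistent with the 256-block data_offset shown for each configured base bdev once the superblock is in place.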
00:28:04.317 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:04.317 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:04.317 13:53:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:04.584 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:04.584 "name": "pt2", 00:28:04.584 "aliases": [ 00:28:04.584 "00000000-0000-0000-0000-000000000002" 00:28:04.584 ], 00:28:04.584 "product_name": "passthru", 00:28:04.584 "block_size": 4096, 00:28:04.584 "num_blocks": 8192, 00:28:04.584 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:04.584 "md_size": 32, 00:28:04.584 "md_interleave": false, 00:28:04.584 "dif_type": 0, 00:28:04.584 "assigned_rate_limits": { 00:28:04.584 "rw_ios_per_sec": 0, 00:28:04.584 "rw_mbytes_per_sec": 0, 00:28:04.584 "r_mbytes_per_sec": 0, 00:28:04.584 "w_mbytes_per_sec": 0 00:28:04.584 }, 00:28:04.584 "claimed": true, 00:28:04.584 "claim_type": "exclusive_write", 00:28:04.584 "zoned": false, 00:28:04.584 "supported_io_types": { 00:28:04.584 "read": true, 00:28:04.584 "write": true, 00:28:04.584 "unmap": true, 00:28:04.584 "flush": true, 00:28:04.584 "reset": true, 00:28:04.584 "nvme_admin": false, 00:28:04.584 "nvme_io": false, 00:28:04.584 "nvme_io_md": false, 00:28:04.584 "write_zeroes": true, 00:28:04.584 "zcopy": true, 00:28:04.584 "get_zone_info": false, 00:28:04.584 "zone_management": false, 00:28:04.584 "zone_append": false, 00:28:04.584 "compare": false, 00:28:04.584 "compare_and_write": false, 00:28:04.584 "abort": true, 00:28:04.584 "seek_hole": false, 00:28:04.584 "seek_data": false, 00:28:04.584 "copy": true, 00:28:04.584 "nvme_iov_md": false 00:28:04.584 }, 00:28:04.584 "memory_domains": [ 00:28:04.584 { 00:28:04.584 "dma_device_id": "system", 00:28:04.584 "dma_device_type": 1 00:28:04.584 }, 00:28:04.584 { 00:28:04.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:04.584 "dma_device_type": 2 00:28:04.584 } 00:28:04.584 ], 00:28:04.584 "driver_specific": { 00:28:04.584 "passthru": { 00:28:04.584 "name": "pt2", 00:28:04.584 "base_bdev_name": "malloc2" 00:28:04.584 } 00:28:04.584 } 00:28:04.584 }' 00:28:04.584 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:04.584 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:04.853 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:04.853 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.853 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.853 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:04.853 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:04.853 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:04.853 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:04.853 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:05.113 13:53:53 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:05.113 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:05.113 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:05.113 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:05.371 [2024-07-12 13:53:53.710490] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:05.371 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=f6669ef6-bdf2-42a5-9260-060569af4857 00:28:05.371 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z f6669ef6-bdf2-42a5-9260-060569af4857 ']' 00:28:05.371 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:05.630 [2024-07-12 13:53:53.954857] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:05.630 [2024-07-12 13:53:53.954877] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:05.630 [2024-07-12 13:53:53.954942] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:05.630 [2024-07-12 13:53:53.954995] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:05.630 [2024-07-12 13:53:53.955008] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf9ad60 name raid_bdev1, state offline 00:28:05.630 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.630 13:53:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:05.889 13:53:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:05.889 13:53:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:05.889 13:53:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:05.889 13:53:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:05.889 13:53:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:05.889 13:53:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:06.148 13:53:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:06.148 13:53:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:06.407 13:53:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:06.408 13:53:54 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:06.667 [2024-07-12 13:53:55.186222] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:06.667 [2024-07-12 13:53:55.187569] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:06.667 [2024-07-12 13:53:55.187621] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:06.667 [2024-07-12 13:53:55.187661] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:06.667 [2024-07-12 13:53:55.187680] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:06.667 [2024-07-12 13:53:55.187690] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf9afe0 name raid_bdev1, state configuring 00:28:06.667 request: 00:28:06.667 { 00:28:06.667 "name": "raid_bdev1", 00:28:06.667 "raid_level": "raid1", 00:28:06.667 "base_bdevs": [ 00:28:06.667 "malloc1", 00:28:06.667 "malloc2" 00:28:06.667 ], 00:28:06.667 "superblock": false, 00:28:06.667 "method": "bdev_raid_create", 00:28:06.667 "req_id": 1 00:28:06.667 } 00:28:06.667 Got JSON-RPC error response 00:28:06.667 response: 00:28:06.667 { 00:28:06.667 "code": -17, 00:28:06.667 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:06.667 } 00:28:06.667 13:53:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:28:06.667 13:53:55 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:06.667 13:53:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:06.667 13:53:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:06.667 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.667 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:06.926 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:06.926 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:06.926 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:07.185 [2024-07-12 13:53:55.679467] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:07.185 [2024-07-12 13:53:55.679513] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:07.185 [2024-07-12 13:53:55.679532] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe86ab0 00:28:07.185 [2024-07-12 13:53:55.679544] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:07.185 [2024-07-12 13:53:55.681059] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:07.185 [2024-07-12 13:53:55.681088] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:07.185 [2024-07-12 13:53:55.681132] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:07.185 [2024-07-12 13:53:55.681156] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:07.185 pt1 00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:28:07.185 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.444 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:07.444 "name": "raid_bdev1", 00:28:07.444 "uuid": "f6669ef6-bdf2-42a5-9260-060569af4857", 00:28:07.444 "strip_size_kb": 0, 00:28:07.444 "state": "configuring", 00:28:07.444 "raid_level": "raid1", 00:28:07.444 "superblock": true, 00:28:07.444 "num_base_bdevs": 2, 00:28:07.444 "num_base_bdevs_discovered": 1, 00:28:07.444 "num_base_bdevs_operational": 2, 00:28:07.444 "base_bdevs_list": [ 00:28:07.444 { 00:28:07.444 "name": "pt1", 00:28:07.444 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:07.444 "is_configured": true, 00:28:07.444 "data_offset": 256, 00:28:07.444 "data_size": 7936 00:28:07.444 }, 00:28:07.444 { 00:28:07.444 "name": null, 00:28:07.444 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:07.444 "is_configured": false, 00:28:07.444 "data_offset": 256, 00:28:07.444 "data_size": 7936 00:28:07.444 } 00:28:07.444 ] 00:28:07.444 }' 00:28:07.444 13:53:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:07.444 13:53:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:08.011 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:08.011 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:08.011 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:08.011 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:08.271 [2024-07-12 13:53:56.818480] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:08.271 [2024-07-12 13:53:56.818529] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:08.271 [2024-07-12 13:53:56.818551] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf9c7b0 00:28:08.271 [2024-07-12 13:53:56.818564] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:08.271 [2024-07-12 13:53:56.818761] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:08.271 [2024-07-12 13:53:56.818779] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:08.271 [2024-07-12 13:53:56.818823] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:08.271 [2024-07-12 13:53:56.818847] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:08.271 [2024-07-12 13:53:56.818949] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe05120 00:28:08.271 [2024-07-12 13:53:56.818961] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:08.271 [2024-07-12 13:53:56.819015] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfa0f40 00:28:08.271 [2024-07-12 13:53:56.819116] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe05120 00:28:08.271 [2024-07-12 13:53:56.819126] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe05120 00:28:08.271 
[2024-07-12 13:53:56.819195] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:08.271 pt2 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.271 13:53:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:08.530 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:08.530 "name": "raid_bdev1", 00:28:08.530 "uuid": "f6669ef6-bdf2-42a5-9260-060569af4857", 00:28:08.530 "strip_size_kb": 0, 00:28:08.530 "state": "online", 00:28:08.530 "raid_level": "raid1", 00:28:08.530 "superblock": true, 00:28:08.530 "num_base_bdevs": 2, 00:28:08.530 "num_base_bdevs_discovered": 2, 00:28:08.530 "num_base_bdevs_operational": 2, 00:28:08.530 "base_bdevs_list": [ 00:28:08.530 { 00:28:08.530 "name": "pt1", 00:28:08.530 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:08.530 "is_configured": true, 00:28:08.530 "data_offset": 256, 00:28:08.530 "data_size": 7936 00:28:08.530 }, 00:28:08.530 { 00:28:08.530 "name": "pt2", 00:28:08.530 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:08.530 "is_configured": true, 00:28:08.530 "data_offset": 256, 00:28:08.530 "data_size": 7936 00:28:08.530 } 00:28:08.530 ] 00:28:08.530 }' 00:28:08.530 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:08.530 13:53:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:09.467 [2024-07-12 13:53:57.865534] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:09.467 "name": "raid_bdev1", 00:28:09.467 "aliases": [ 00:28:09.467 "f6669ef6-bdf2-42a5-9260-060569af4857" 00:28:09.467 ], 00:28:09.467 "product_name": "Raid Volume", 00:28:09.467 "block_size": 4096, 00:28:09.467 "num_blocks": 7936, 00:28:09.467 "uuid": "f6669ef6-bdf2-42a5-9260-060569af4857", 00:28:09.467 "md_size": 32, 00:28:09.467 "md_interleave": false, 00:28:09.467 "dif_type": 0, 00:28:09.467 "assigned_rate_limits": { 00:28:09.467 "rw_ios_per_sec": 0, 00:28:09.467 "rw_mbytes_per_sec": 0, 00:28:09.467 "r_mbytes_per_sec": 0, 00:28:09.467 "w_mbytes_per_sec": 0 00:28:09.467 }, 00:28:09.467 "claimed": false, 00:28:09.467 "zoned": false, 00:28:09.467 "supported_io_types": { 00:28:09.467 "read": true, 00:28:09.467 "write": true, 00:28:09.467 "unmap": false, 00:28:09.467 "flush": false, 00:28:09.467 "reset": true, 00:28:09.467 "nvme_admin": false, 00:28:09.467 "nvme_io": false, 00:28:09.467 "nvme_io_md": false, 00:28:09.467 "write_zeroes": true, 00:28:09.467 "zcopy": false, 00:28:09.467 "get_zone_info": false, 00:28:09.467 "zone_management": false, 00:28:09.467 "zone_append": false, 00:28:09.467 "compare": false, 00:28:09.467 "compare_and_write": false, 00:28:09.467 "abort": false, 00:28:09.467 "seek_hole": false, 00:28:09.467 "seek_data": false, 00:28:09.467 "copy": false, 00:28:09.467 "nvme_iov_md": false 00:28:09.467 }, 00:28:09.467 "memory_domains": [ 00:28:09.467 { 00:28:09.467 "dma_device_id": "system", 00:28:09.467 "dma_device_type": 1 00:28:09.467 }, 00:28:09.467 { 00:28:09.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.467 "dma_device_type": 2 00:28:09.467 }, 00:28:09.467 { 00:28:09.467 "dma_device_id": "system", 00:28:09.467 "dma_device_type": 1 00:28:09.467 }, 00:28:09.467 { 00:28:09.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.467 "dma_device_type": 2 00:28:09.467 } 00:28:09.467 ], 00:28:09.467 "driver_specific": { 00:28:09.467 "raid": { 00:28:09.467 "uuid": "f6669ef6-bdf2-42a5-9260-060569af4857", 00:28:09.467 "strip_size_kb": 0, 00:28:09.467 "state": "online", 00:28:09.467 "raid_level": "raid1", 00:28:09.467 "superblock": true, 00:28:09.467 "num_base_bdevs": 2, 00:28:09.467 "num_base_bdevs_discovered": 2, 00:28:09.467 "num_base_bdevs_operational": 2, 00:28:09.467 "base_bdevs_list": [ 00:28:09.467 { 00:28:09.467 "name": "pt1", 00:28:09.467 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:09.467 "is_configured": true, 00:28:09.467 "data_offset": 256, 00:28:09.467 "data_size": 7936 00:28:09.467 }, 00:28:09.467 { 00:28:09.467 "name": "pt2", 00:28:09.467 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:09.467 "is_configured": true, 00:28:09.467 "data_offset": 256, 00:28:09.467 "data_size": 7936 00:28:09.467 } 00:28:09.467 ] 00:28:09.467 } 
00:28:09.467 } 00:28:09.467 }' 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:09.467 pt2' 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:09.467 13:53:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:09.727 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:09.727 "name": "pt1", 00:28:09.727 "aliases": [ 00:28:09.727 "00000000-0000-0000-0000-000000000001" 00:28:09.727 ], 00:28:09.727 "product_name": "passthru", 00:28:09.727 "block_size": 4096, 00:28:09.727 "num_blocks": 8192, 00:28:09.727 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:09.727 "md_size": 32, 00:28:09.727 "md_interleave": false, 00:28:09.727 "dif_type": 0, 00:28:09.727 "assigned_rate_limits": { 00:28:09.727 "rw_ios_per_sec": 0, 00:28:09.727 "rw_mbytes_per_sec": 0, 00:28:09.727 "r_mbytes_per_sec": 0, 00:28:09.727 "w_mbytes_per_sec": 0 00:28:09.727 }, 00:28:09.727 "claimed": true, 00:28:09.727 "claim_type": "exclusive_write", 00:28:09.727 "zoned": false, 00:28:09.727 "supported_io_types": { 00:28:09.727 "read": true, 00:28:09.727 "write": true, 00:28:09.727 "unmap": true, 00:28:09.727 "flush": true, 00:28:09.727 "reset": true, 00:28:09.727 "nvme_admin": false, 00:28:09.727 "nvme_io": false, 00:28:09.727 "nvme_io_md": false, 00:28:09.727 "write_zeroes": true, 00:28:09.727 "zcopy": true, 00:28:09.727 "get_zone_info": false, 00:28:09.727 "zone_management": false, 00:28:09.727 "zone_append": false, 00:28:09.727 "compare": false, 00:28:09.727 "compare_and_write": false, 00:28:09.727 "abort": true, 00:28:09.727 "seek_hole": false, 00:28:09.727 "seek_data": false, 00:28:09.727 "copy": true, 00:28:09.727 "nvme_iov_md": false 00:28:09.727 }, 00:28:09.727 "memory_domains": [ 00:28:09.727 { 00:28:09.727 "dma_device_id": "system", 00:28:09.727 "dma_device_type": 1 00:28:09.727 }, 00:28:09.727 { 00:28:09.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.727 "dma_device_type": 2 00:28:09.727 } 00:28:09.727 ], 00:28:09.727 "driver_specific": { 00:28:09.727 "passthru": { 00:28:09.727 "name": "pt1", 00:28:09.727 "base_bdev_name": "malloc1" 00:28:09.727 } 00:28:09.727 } 00:28:09.727 }' 00:28:09.727 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:09.727 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:09.727 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:09.727 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:09.987 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:09.987 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:09.987 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:09.987 13:53:58 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:09.987 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:09.987 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:09.987 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:10.246 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:10.246 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:10.246 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:10.246 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:10.506 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:10.506 "name": "pt2", 00:28:10.506 "aliases": [ 00:28:10.506 "00000000-0000-0000-0000-000000000002" 00:28:10.506 ], 00:28:10.506 "product_name": "passthru", 00:28:10.506 "block_size": 4096, 00:28:10.506 "num_blocks": 8192, 00:28:10.506 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:10.506 "md_size": 32, 00:28:10.506 "md_interleave": false, 00:28:10.506 "dif_type": 0, 00:28:10.506 "assigned_rate_limits": { 00:28:10.506 "rw_ios_per_sec": 0, 00:28:10.506 "rw_mbytes_per_sec": 0, 00:28:10.506 "r_mbytes_per_sec": 0, 00:28:10.506 "w_mbytes_per_sec": 0 00:28:10.506 }, 00:28:10.507 "claimed": true, 00:28:10.507 "claim_type": "exclusive_write", 00:28:10.507 "zoned": false, 00:28:10.507 "supported_io_types": { 00:28:10.507 "read": true, 00:28:10.507 "write": true, 00:28:10.507 "unmap": true, 00:28:10.507 "flush": true, 00:28:10.507 "reset": true, 00:28:10.507 "nvme_admin": false, 00:28:10.507 "nvme_io": false, 00:28:10.507 "nvme_io_md": false, 00:28:10.507 "write_zeroes": true, 00:28:10.507 "zcopy": true, 00:28:10.507 "get_zone_info": false, 00:28:10.507 "zone_management": false, 00:28:10.507 "zone_append": false, 00:28:10.507 "compare": false, 00:28:10.507 "compare_and_write": false, 00:28:10.507 "abort": true, 00:28:10.507 "seek_hole": false, 00:28:10.507 "seek_data": false, 00:28:10.507 "copy": true, 00:28:10.507 "nvme_iov_md": false 00:28:10.507 }, 00:28:10.507 "memory_domains": [ 00:28:10.507 { 00:28:10.507 "dma_device_id": "system", 00:28:10.507 "dma_device_type": 1 00:28:10.507 }, 00:28:10.507 { 00:28:10.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:10.507 "dma_device_type": 2 00:28:10.507 } 00:28:10.507 ], 00:28:10.507 "driver_specific": { 00:28:10.507 "passthru": { 00:28:10.507 "name": "pt2", 00:28:10.507 "base_bdev_name": "malloc2" 00:28:10.507 } 00:28:10.507 } 00:28:10.507 }' 00:28:10.507 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:10.507 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:10.507 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:10.507 13:53:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:10.507 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:10.507 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 
== 32 ]] 00:28:10.507 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:10.767 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:10.767 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:10.767 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:10.767 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:10.767 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:10.767 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:10.767 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:11.026 [2024-07-12 13:53:59.493851] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:11.026 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' f6669ef6-bdf2-42a5-9260-060569af4857 '!=' f6669ef6-bdf2-42a5-9260-060569af4857 ']' 00:28:11.026 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:11.026 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:11.026 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:11.026 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:11.286 [2024-07-12 13:53:59.738251] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.286 13:53:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:11.545 13:54:00 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:11.545 "name": "raid_bdev1", 00:28:11.545 "uuid": "f6669ef6-bdf2-42a5-9260-060569af4857", 00:28:11.545 "strip_size_kb": 0, 00:28:11.545 "state": "online", 00:28:11.545 "raid_level": "raid1", 00:28:11.545 "superblock": true, 00:28:11.545 "num_base_bdevs": 2, 00:28:11.545 "num_base_bdevs_discovered": 1, 00:28:11.545 "num_base_bdevs_operational": 1, 00:28:11.545 "base_bdevs_list": [ 00:28:11.545 { 00:28:11.545 "name": null, 00:28:11.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:11.545 "is_configured": false, 00:28:11.545 "data_offset": 256, 00:28:11.545 "data_size": 7936 00:28:11.545 }, 00:28:11.545 { 00:28:11.545 "name": "pt2", 00:28:11.545 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:11.545 "is_configured": true, 00:28:11.545 "data_offset": 256, 00:28:11.545 "data_size": 7936 00:28:11.545 } 00:28:11.545 ] 00:28:11.545 }' 00:28:11.545 13:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:11.545 13:54:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:12.114 13:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:12.374 [2024-07-12 13:54:00.841138] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:12.374 [2024-07-12 13:54:00.841163] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:12.374 [2024-07-12 13:54:00.841219] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:12.374 [2024-07-12 13:54:00.841262] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:12.374 [2024-07-12 13:54:00.841275] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe05120 name raid_bdev1, state offline 00:28:12.374 13:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:12.374 13:54:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.634 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:28:12.634 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:12.635 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:12.635 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:12.635 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:12.893 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:12.893 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:12.893 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:12.893 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:12.893 13:54:01 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@518 -- # i=1 00:28:12.893 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:13.150 [2024-07-12 13:54:01.591078] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:13.150 [2024-07-12 13:54:01.591128] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:13.150 [2024-07-12 13:54:01.591148] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf9bf40 00:28:13.150 [2024-07-12 13:54:01.591162] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:13.150 [2024-07-12 13:54:01.592626] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:13.150 [2024-07-12 13:54:01.592657] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:13.150 [2024-07-12 13:54:01.592706] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:13.150 [2024-07-12 13:54:01.592730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:13.150 [2024-07-12 13:54:01.592808] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfa0aa0 00:28:13.150 [2024-07-12 13:54:01.592819] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:13.150 [2024-07-12 13:54:01.592878] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfa07e0 00:28:13.150 [2024-07-12 13:54:01.592995] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfa0aa0 00:28:13.150 [2024-07-12 13:54:01.593006] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfa0aa0 00:28:13.150 [2024-07-12 13:54:01.593074] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:13.150 pt2 00:28:13.150 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:13.150 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:13.150 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:13.150 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:13.151 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:13.151 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:13.151 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:13.151 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:13.151 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:13.151 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:13.151 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.151 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.519 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:13.519 "name": "raid_bdev1", 00:28:13.519 "uuid": "f6669ef6-bdf2-42a5-9260-060569af4857", 00:28:13.519 "strip_size_kb": 0, 00:28:13.519 "state": "online", 00:28:13.519 "raid_level": "raid1", 00:28:13.519 "superblock": true, 00:28:13.519 "num_base_bdevs": 2, 00:28:13.519 "num_base_bdevs_discovered": 1, 00:28:13.519 "num_base_bdevs_operational": 1, 00:28:13.519 "base_bdevs_list": [ 00:28:13.519 { 00:28:13.519 "name": null, 00:28:13.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:13.519 "is_configured": false, 00:28:13.519 "data_offset": 256, 00:28:13.519 "data_size": 7936 00:28:13.519 }, 00:28:13.519 { 00:28:13.519 "name": "pt2", 00:28:13.519 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:13.519 "is_configured": true, 00:28:13.519 "data_offset": 256, 00:28:13.519 "data_size": 7936 00:28:13.519 } 00:28:13.519 ] 00:28:13.519 }' 00:28:13.519 13:54:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:13.519 13:54:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:14.084 13:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:14.342 [2024-07-12 13:54:02.770215] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:14.342 [2024-07-12 13:54:02.770241] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:14.342 [2024-07-12 13:54:02.770298] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:14.342 [2024-07-12 13:54:02.770341] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:14.342 [2024-07-12 13:54:02.770353] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfa0aa0 name raid_bdev1, state offline 00:28:14.342 13:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.342 13:54:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:14.601 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:14.601 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:14.601 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:14.601 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:14.859 [2024-07-12 13:54:03.275533] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:14.859 [2024-07-12 13:54:03.275585] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:14.859 [2024-07-12 13:54:03.275604] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfa0d20 00:28:14.859 [2024-07-12 13:54:03.275617] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:14.859 [2024-07-12 13:54:03.277097] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:14.860 [2024-07-12 13:54:03.277125] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:14.860 [2024-07-12 13:54:03.277173] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:14.860 [2024-07-12 13:54:03.277197] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:14.860 [2024-07-12 13:54:03.277287] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:14.860 [2024-07-12 13:54:03.277300] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:14.860 [2024-07-12 13:54:03.277314] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf9fc70 name raid_bdev1, state configuring 00:28:14.860 [2024-07-12 13:54:03.277337] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:14.860 [2024-07-12 13:54:03.277386] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf9d3b0 00:28:14.860 [2024-07-12 13:54:03.277397] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:14.860 [2024-07-12 13:54:03.277450] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf9ded0 00:28:14.860 [2024-07-12 13:54:03.277543] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf9d3b0 00:28:14.860 [2024-07-12 13:54:03.277553] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf9d3b0 00:28:14.860 [2024-07-12 13:54:03.277620] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:14.860 pt1 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:14.860 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.118 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:28:15.118 "name": "raid_bdev1", 00:28:15.118 "uuid": "f6669ef6-bdf2-42a5-9260-060569af4857", 00:28:15.118 "strip_size_kb": 0, 00:28:15.118 "state": "online", 00:28:15.118 "raid_level": "raid1", 00:28:15.118 "superblock": true, 00:28:15.118 "num_base_bdevs": 2, 00:28:15.118 "num_base_bdevs_discovered": 1, 00:28:15.118 "num_base_bdevs_operational": 1, 00:28:15.118 "base_bdevs_list": [ 00:28:15.118 { 00:28:15.118 "name": null, 00:28:15.118 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.118 "is_configured": false, 00:28:15.118 "data_offset": 256, 00:28:15.118 "data_size": 7936 00:28:15.118 }, 00:28:15.118 { 00:28:15.118 "name": "pt2", 00:28:15.118 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:15.118 "is_configured": true, 00:28:15.118 "data_offset": 256, 00:28:15.118 "data_size": 7936 00:28:15.118 } 00:28:15.118 ] 00:28:15.118 }' 00:28:15.118 13:54:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:15.118 13:54:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:15.684 13:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:15.684 13:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:15.942 13:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:15.942 13:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:15.942 13:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:16.201 [2024-07-12 13:54:04.631363] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' f6669ef6-bdf2-42a5-9260-060569af4857 '!=' f6669ef6-bdf2-42a5-9260-060569af4857 ']' 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 581757 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 581757 ']' 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 581757 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 581757 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 581757' 00:28:16.201 killing process with pid 581757 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 581757 00:28:16.201 [2024-07-12 13:54:04.703053] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:16.201 [2024-07-12 13:54:04.703110] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:16.201 [2024-07-12 13:54:04.703152] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:16.201 [2024-07-12 13:54:04.703165] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf9d3b0 name raid_bdev1, state offline 00:28:16.201 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 581757 00:28:16.201 [2024-07-12 13:54:04.727212] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:16.460 13:54:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:28:16.460 00:28:16.460 real 0m15.960s 00:28:16.460 user 0m29.139s 00:28:16.460 sys 0m2.879s 00:28:16.460 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:16.460 13:54:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:16.460 ************************************ 00:28:16.460 END TEST raid_superblock_test_md_separate 00:28:16.460 ************************************ 00:28:16.460 13:54:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:16.460 13:54:04 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:28:16.460 13:54:04 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:28:16.460 13:54:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:16.460 13:54:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:16.460 13:54:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:16.460 ************************************ 00:28:16.460 START TEST raid_rebuild_test_sb_md_separate 00:28:16.460 ************************************ 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:16.460 13:54:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=584391 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 584391 /var/tmp/spdk-raid.sock 00:28:16.460 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:16.461 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 584391 ']' 00:28:16.461 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:16.461 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:16.461 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:16.461 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:16.461 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:16.461 13:54:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:16.719 [2024-07-12 13:54:05.087757] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:28:16.719 [2024-07-12 13:54:05.087823] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid584391 ] 00:28:16.719 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:16.719 Zero copy mechanism will not be used. 
00:28:16.719 [2024-07-12 13:54:05.210625] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:16.977 [2024-07-12 13:54:05.318453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:16.977 [2024-07-12 13:54:05.374383] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:16.977 [2024-07-12 13:54:05.374416] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:17.545 13:54:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:17.546 13:54:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:17.546 13:54:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:17.546 13:54:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:28:17.804 BaseBdev1_malloc 00:28:17.804 13:54:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:18.063 [2024-07-12 13:54:06.480903] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:18.063 [2024-07-12 13:54:06.480962] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:18.063 [2024-07-12 13:54:06.480987] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16eb010 00:28:18.063 [2024-07-12 13:54:06.481000] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:18.063 [2024-07-12 13:54:06.482395] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:18.063 [2024-07-12 13:54:06.482426] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:18.063 BaseBdev1 00:28:18.063 13:54:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:18.063 13:54:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:28:18.321 BaseBdev2_malloc 00:28:18.321 13:54:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:18.579 [2024-07-12 13:54:06.975699] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:18.579 [2024-07-12 13:54:06.975747] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:18.579 [2024-07-12 13:54:06.975770] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1842b30 00:28:18.579 [2024-07-12 13:54:06.975783] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:18.579 [2024-07-12 13:54:06.977075] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:18.579 [2024-07-12 13:54:06.977102] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:18.579 BaseBdev2 00:28:18.579 13:54:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:28:18.838 spare_malloc 00:28:18.838 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:19.097 spare_delay 00:28:19.097 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:19.356 [2024-07-12 13:54:07.723068] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:19.356 [2024-07-12 13:54:07.723117] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:19.356 [2024-07-12 13:54:07.723147] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x183f0e0 00:28:19.356 [2024-07-12 13:54:07.723160] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:19.356 [2024-07-12 13:54:07.724469] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:19.356 [2024-07-12 13:54:07.724499] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:19.356 spare 00:28:19.357 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:19.616 [2024-07-12 13:54:07.971753] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:19.616 [2024-07-12 13:54:07.972985] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:19.616 [2024-07-12 13:54:07.973143] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x183fb00 00:28:19.616 [2024-07-12 13:54:07.973157] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:19.616 [2024-07-12 13:54:07.973230] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1750ca0 00:28:19.616 [2024-07-12 13:54:07.973340] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x183fb00 00:28:19.616 [2024-07-12 13:54:07.973351] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x183fb00 00:28:19.616 [2024-07-12 13:54:07.973419] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:19.616 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:19.616 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:19.616 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:19.616 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:19.616 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:19.616 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:19.616 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:28:19.616 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:19.616 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:19.616 13:54:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:19.616 13:54:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.616 13:54:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.616 13:54:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:19.616 "name": "raid_bdev1", 00:28:19.616 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:19.616 "strip_size_kb": 0, 00:28:19.616 "state": "online", 00:28:19.616 "raid_level": "raid1", 00:28:19.616 "superblock": true, 00:28:19.616 "num_base_bdevs": 2, 00:28:19.616 "num_base_bdevs_discovered": 2, 00:28:19.616 "num_base_bdevs_operational": 2, 00:28:19.616 "base_bdevs_list": [ 00:28:19.616 { 00:28:19.616 "name": "BaseBdev1", 00:28:19.616 "uuid": "ee33e5c0-96c9-5546-bab9-c47be7708576", 00:28:19.616 "is_configured": true, 00:28:19.616 "data_offset": 256, 00:28:19.616 "data_size": 7936 00:28:19.616 }, 00:28:19.616 { 00:28:19.616 "name": "BaseBdev2", 00:28:19.616 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:19.616 "is_configured": true, 00:28:19.616 "data_offset": 256, 00:28:19.616 "data_size": 7936 00:28:19.616 } 00:28:19.616 ] 00:28:19.616 }' 00:28:19.617 13:54:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:19.617 13:54:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:20.554 13:54:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:20.554 13:54:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:20.554 [2024-07-12 13:54:09.010760] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:20.554 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:20.554 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.554 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:20.814 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:21.074 [2024-07-12 13:54:09.507847] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1750ca0 00:28:21.074 /dev/nbd0 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:21.074 1+0 records in 00:28:21.074 1+0 records out 00:28:21.074 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236418 s, 17.3 MB/s 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:21.074 13:54:09 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:28:21.074 13:54:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:28:22.009 7936+0 records in 00:28:22.009 7936+0 records out 00:28:22.009 32505856 bytes (33 MB, 31 MiB) copied, 0.754463 s, 43.1 MB/s 00:28:22.009 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:22.009 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:22.009 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:22.009 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:22.009 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:28:22.009 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:22.009 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:22.267 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:22.267 [2024-07-12 13:54:10.609190] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:22.267 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:22.267 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:22.267 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:22.267 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:22.267 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:22.267 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:22.267 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:22.267 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:22.267 [2024-07-12 13:54:10.841854] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:22.524 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:22.524 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:22.524 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:22.524 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:22.524 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:22.524 13:54:10 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:22.524 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:22.524 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:22.524 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:22.524 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:22.524 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.524 13:54:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:22.782 13:54:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:22.782 "name": "raid_bdev1", 00:28:22.782 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:22.782 "strip_size_kb": 0, 00:28:22.782 "state": "online", 00:28:22.782 "raid_level": "raid1", 00:28:22.782 "superblock": true, 00:28:22.782 "num_base_bdevs": 2, 00:28:22.782 "num_base_bdevs_discovered": 1, 00:28:22.782 "num_base_bdevs_operational": 1, 00:28:22.782 "base_bdevs_list": [ 00:28:22.782 { 00:28:22.782 "name": null, 00:28:22.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:22.782 "is_configured": false, 00:28:22.782 "data_offset": 256, 00:28:22.782 "data_size": 7936 00:28:22.782 }, 00:28:22.782 { 00:28:22.782 "name": "BaseBdev2", 00:28:22.782 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:22.782 "is_configured": true, 00:28:22.782 "data_offset": 256, 00:28:22.782 "data_size": 7936 00:28:22.782 } 00:28:22.782 ] 00:28:22.782 }' 00:28:22.782 13:54:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:22.782 13:54:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:23.404 13:54:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:23.404 [2024-07-12 13:54:11.952809] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:23.404 [2024-07-12 13:54:11.955170] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16e9c90 00:28:23.404 [2024-07-12 13:54:11.957471] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:23.404 13:54:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:24.778 13:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:24.778 13:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:24.778 13:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:24.778 13:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:24.778 13:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:24.778 13:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.778 13:54:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:24.778 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:24.778 "name": "raid_bdev1", 00:28:24.778 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:24.778 "strip_size_kb": 0, 00:28:24.778 "state": "online", 00:28:24.778 "raid_level": "raid1", 00:28:24.778 "superblock": true, 00:28:24.778 "num_base_bdevs": 2, 00:28:24.778 "num_base_bdevs_discovered": 2, 00:28:24.778 "num_base_bdevs_operational": 2, 00:28:24.778 "process": { 00:28:24.778 "type": "rebuild", 00:28:24.778 "target": "spare", 00:28:24.778 "progress": { 00:28:24.778 "blocks": 3072, 00:28:24.778 "percent": 38 00:28:24.778 } 00:28:24.778 }, 00:28:24.778 "base_bdevs_list": [ 00:28:24.778 { 00:28:24.778 "name": "spare", 00:28:24.778 "uuid": "40a18204-be82-538b-bd59-c61984a9e260", 00:28:24.778 "is_configured": true, 00:28:24.778 "data_offset": 256, 00:28:24.778 "data_size": 7936 00:28:24.778 }, 00:28:24.778 { 00:28:24.778 "name": "BaseBdev2", 00:28:24.778 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:24.778 "is_configured": true, 00:28:24.778 "data_offset": 256, 00:28:24.778 "data_size": 7936 00:28:24.778 } 00:28:24.778 ] 00:28:24.778 }' 00:28:24.778 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:24.778 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:24.778 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:24.778 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:24.778 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:25.036 [2024-07-12 13:54:13.558493] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:25.036 [2024-07-12 13:54:13.570342] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:25.036 [2024-07-12 13:54:13.570389] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:25.036 [2024-07-12 13:54:13.570405] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:25.036 [2024-07-12 13:54:13.570414] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 
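
The verify_raid_bdev_state calls being traced here reduce to querying bdev_raid_get_bdevs and comparing a handful of JSON fields. Below is a condensed, illustrative form of that check using only the RPCs and jq filters already visible in this trace; the helper name check_raid_state and the exact set of fields compared are illustrative simplifications, not the test suite's helper.

# Condensed sketch of the degraded-state check: after removing a base bdev,
# raid_bdev1 should stay online as raid1 with a single discovered base bdev.
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

check_raid_state() {
    local name=$1 want_state=$2 want_discovered=$3 info
    info=$($rpc_py bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
    [ "$(jq -r '.state' <<< "$info")" = "$want_state" ] &&
        [ "$(jq -r '.raid_level' <<< "$info")" = "raid1" ] &&
        [ "$(jq -r '.num_base_bdevs_discovered' <<< "$info")" -eq "$want_discovered" ]
}

check_raid_state raid_bdev1 online 1 || echo "raid_bdev1 not in the expected degraded state" >&2
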
00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.036 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:25.295 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:25.295 "name": "raid_bdev1", 00:28:25.295 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:25.295 "strip_size_kb": 0, 00:28:25.295 "state": "online", 00:28:25.295 "raid_level": "raid1", 00:28:25.295 "superblock": true, 00:28:25.295 "num_base_bdevs": 2, 00:28:25.295 "num_base_bdevs_discovered": 1, 00:28:25.295 "num_base_bdevs_operational": 1, 00:28:25.295 "base_bdevs_list": [ 00:28:25.295 { 00:28:25.295 "name": null, 00:28:25.295 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:25.295 "is_configured": false, 00:28:25.295 "data_offset": 256, 00:28:25.295 "data_size": 7936 00:28:25.295 }, 00:28:25.295 { 00:28:25.295 "name": "BaseBdev2", 00:28:25.295 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:25.295 "is_configured": true, 00:28:25.295 "data_offset": 256, 00:28:25.295 "data_size": 7936 00:28:25.295 } 00:28:25.295 ] 00:28:25.295 }' 00:28:25.295 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:25.295 13:54:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:25.862 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:25.862 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:25.862 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:25.862 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:25.862 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:25.862 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.862 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:26.121 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:26.121 "name": "raid_bdev1", 00:28:26.121 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:26.121 "strip_size_kb": 0, 00:28:26.121 "state": "online", 00:28:26.121 "raid_level": "raid1", 00:28:26.121 "superblock": true, 00:28:26.121 "num_base_bdevs": 2, 00:28:26.121 "num_base_bdevs_discovered": 1, 00:28:26.121 "num_base_bdevs_operational": 1, 00:28:26.121 "base_bdevs_list": [ 00:28:26.121 { 00:28:26.121 "name": null, 00:28:26.121 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:28:26.121 "is_configured": false, 00:28:26.121 "data_offset": 256, 00:28:26.121 "data_size": 7936 00:28:26.121 }, 00:28:26.121 { 00:28:26.121 "name": "BaseBdev2", 00:28:26.121 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:26.121 "is_configured": true, 00:28:26.121 "data_offset": 256, 00:28:26.121 "data_size": 7936 00:28:26.121 } 00:28:26.121 ] 00:28:26.121 }' 00:28:26.121 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:26.380 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:26.380 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:26.380 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:26.380 13:54:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:26.639 [2024-07-12 13:54:15.025390] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:26.639 [2024-07-12 13:54:15.028024] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16eabc0 00:28:26.639 [2024-07-12 13:54:15.029620] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:26.639 13:54:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:27.576 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:27.576 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:27.576 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:27.576 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:27.576 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:27.576 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.576 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:27.835 "name": "raid_bdev1", 00:28:27.835 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:27.835 "strip_size_kb": 0, 00:28:27.835 "state": "online", 00:28:27.835 "raid_level": "raid1", 00:28:27.835 "superblock": true, 00:28:27.835 "num_base_bdevs": 2, 00:28:27.835 "num_base_bdevs_discovered": 2, 00:28:27.835 "num_base_bdevs_operational": 2, 00:28:27.835 "process": { 00:28:27.835 "type": "rebuild", 00:28:27.835 "target": "spare", 00:28:27.835 "progress": { 00:28:27.835 "blocks": 3072, 00:28:27.835 "percent": 38 00:28:27.835 } 00:28:27.835 }, 00:28:27.835 "base_bdevs_list": [ 00:28:27.835 { 00:28:27.835 "name": "spare", 00:28:27.835 "uuid": "40a18204-be82-538b-bd59-c61984a9e260", 00:28:27.835 "is_configured": true, 00:28:27.835 "data_offset": 256, 00:28:27.835 "data_size": 7936 00:28:27.835 }, 00:28:27.835 { 00:28:27.835 "name": 
"BaseBdev2", 00:28:27.835 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:27.835 "is_configured": true, 00:28:27.835 "data_offset": 256, 00:28:27.835 "data_size": 7936 00:28:27.835 } 00:28:27.835 ] 00:28:27.835 }' 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:28:27.835 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1120 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.835 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:28.094 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:28.094 "name": "raid_bdev1", 00:28:28.094 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:28.094 "strip_size_kb": 0, 00:28:28.094 "state": "online", 00:28:28.094 "raid_level": "raid1", 00:28:28.094 "superblock": true, 00:28:28.094 "num_base_bdevs": 2, 00:28:28.094 "num_base_bdevs_discovered": 2, 00:28:28.094 "num_base_bdevs_operational": 2, 00:28:28.094 "process": { 00:28:28.094 "type": "rebuild", 00:28:28.094 "target": "spare", 00:28:28.094 "progress": { 00:28:28.094 "blocks": 3840, 00:28:28.094 "percent": 48 00:28:28.094 } 00:28:28.094 }, 00:28:28.094 "base_bdevs_list": [ 00:28:28.094 { 00:28:28.094 "name": "spare", 00:28:28.094 "uuid": "40a18204-be82-538b-bd59-c61984a9e260", 00:28:28.094 "is_configured": true, 00:28:28.094 "data_offset": 256, 00:28:28.094 "data_size": 7936 
00:28:28.094 }, 00:28:28.094 { 00:28:28.094 "name": "BaseBdev2", 00:28:28.094 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:28.094 "is_configured": true, 00:28:28.094 "data_offset": 256, 00:28:28.094 "data_size": 7936 00:28:28.094 } 00:28:28.094 ] 00:28:28.094 }' 00:28:28.094 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:28.094 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:28.094 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:28.353 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:28.353 13:54:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:29.289 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:29.289 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:29.289 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:29.289 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:29.289 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:29.289 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:29.289 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.289 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:29.548 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:29.548 "name": "raid_bdev1", 00:28:29.548 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:29.548 "strip_size_kb": 0, 00:28:29.548 "state": "online", 00:28:29.548 "raid_level": "raid1", 00:28:29.548 "superblock": true, 00:28:29.548 "num_base_bdevs": 2, 00:28:29.548 "num_base_bdevs_discovered": 2, 00:28:29.548 "num_base_bdevs_operational": 2, 00:28:29.548 "process": { 00:28:29.548 "type": "rebuild", 00:28:29.548 "target": "spare", 00:28:29.548 "progress": { 00:28:29.548 "blocks": 7168, 00:28:29.548 "percent": 90 00:28:29.548 } 00:28:29.548 }, 00:28:29.548 "base_bdevs_list": [ 00:28:29.548 { 00:28:29.548 "name": "spare", 00:28:29.548 "uuid": "40a18204-be82-538b-bd59-c61984a9e260", 00:28:29.548 "is_configured": true, 00:28:29.548 "data_offset": 256, 00:28:29.548 "data_size": 7936 00:28:29.548 }, 00:28:29.548 { 00:28:29.548 "name": "BaseBdev2", 00:28:29.548 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:29.548 "is_configured": true, 00:28:29.548 "data_offset": 256, 00:28:29.548 "data_size": 7936 00:28:29.548 } 00:28:29.548 ] 00:28:29.548 }' 00:28:29.548 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:29.548 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:29.548 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:29.548 
13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:29.548 13:54:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:29.807 [2024-07-12 13:54:18.154040] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:29.807 [2024-07-12 13:54:18.154100] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:29.807 [2024-07-12 13:54:18.154182] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:30.742 13:54:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:30.742 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:30.742 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:30.742 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:30.742 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:30.742 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:30.742 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.742 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:30.742 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:30.742 "name": "raid_bdev1", 00:28:30.742 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:30.742 "strip_size_kb": 0, 00:28:30.742 "state": "online", 00:28:30.742 "raid_level": "raid1", 00:28:30.742 "superblock": true, 00:28:30.742 "num_base_bdevs": 2, 00:28:30.742 "num_base_bdevs_discovered": 2, 00:28:30.742 "num_base_bdevs_operational": 2, 00:28:30.742 "base_bdevs_list": [ 00:28:30.742 { 00:28:30.742 "name": "spare", 00:28:30.742 "uuid": "40a18204-be82-538b-bd59-c61984a9e260", 00:28:30.742 "is_configured": true, 00:28:30.742 "data_offset": 256, 00:28:30.742 "data_size": 7936 00:28:30.742 }, 00:28:30.742 { 00:28:30.742 "name": "BaseBdev2", 00:28:30.742 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:30.742 "is_configured": true, 00:28:30.742 "data_offset": 256, 00:28:30.742 "data_size": 7936 00:28:30.742 } 00:28:30.742 ] 00:28:30.742 }' 00:28:30.742 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:31.001 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:31.001 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:31.001 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:31.001 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:28:31.001 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:31.001 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:31.001 13:54:19 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:31.001 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:31.001 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:31.001 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.001 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:31.260 "name": "raid_bdev1", 00:28:31.260 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:31.260 "strip_size_kb": 0, 00:28:31.260 "state": "online", 00:28:31.260 "raid_level": "raid1", 00:28:31.260 "superblock": true, 00:28:31.260 "num_base_bdevs": 2, 00:28:31.260 "num_base_bdevs_discovered": 2, 00:28:31.260 "num_base_bdevs_operational": 2, 00:28:31.260 "base_bdevs_list": [ 00:28:31.260 { 00:28:31.260 "name": "spare", 00:28:31.260 "uuid": "40a18204-be82-538b-bd59-c61984a9e260", 00:28:31.260 "is_configured": true, 00:28:31.260 "data_offset": 256, 00:28:31.260 "data_size": 7936 00:28:31.260 }, 00:28:31.260 { 00:28:31.260 "name": "BaseBdev2", 00:28:31.260 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:31.260 "is_configured": true, 00:28:31.260 "data_offset": 256, 00:28:31.260 "data_size": 7936 00:28:31.260 } 00:28:31.260 ] 00:28:31.260 }' 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:28:31.260 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.520 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.520 "name": "raid_bdev1", 00:28:31.520 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:31.520 "strip_size_kb": 0, 00:28:31.520 "state": "online", 00:28:31.520 "raid_level": "raid1", 00:28:31.520 "superblock": true, 00:28:31.520 "num_base_bdevs": 2, 00:28:31.520 "num_base_bdevs_discovered": 2, 00:28:31.520 "num_base_bdevs_operational": 2, 00:28:31.520 "base_bdevs_list": [ 00:28:31.520 { 00:28:31.520 "name": "spare", 00:28:31.520 "uuid": "40a18204-be82-538b-bd59-c61984a9e260", 00:28:31.520 "is_configured": true, 00:28:31.520 "data_offset": 256, 00:28:31.520 "data_size": 7936 00:28:31.520 }, 00:28:31.520 { 00:28:31.520 "name": "BaseBdev2", 00:28:31.520 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:31.520 "is_configured": true, 00:28:31.520 "data_offset": 256, 00:28:31.520 "data_size": 7936 00:28:31.520 } 00:28:31.520 ] 00:28:31.520 }' 00:28:31.520 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.520 13:54:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:32.088 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:32.347 [2024-07-12 13:54:20.752402] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:32.347 [2024-07-12 13:54:20.752432] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:32.347 [2024-07-12 13:54:20.752490] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:32.347 [2024-07-12 13:54:20.752545] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:32.347 [2024-07-12 13:54:20.752557] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x183fb00 name raid_bdev1, state offline 00:28:32.347 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.347 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:32.606 13:54:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:28:32.866 /dev/nbd0 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:32.866 1+0 records in 00:28:32.866 1+0 records out 00:28:32.866 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257696 s, 15.9 MB/s 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:32.866 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:28:33.126 /dev/nbd1 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:33.126 1+0 records in 00:28:33.126 1+0 records out 00:28:33.126 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000315773 s, 13.0 MB/s 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:33.126 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:33.385 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:33.385 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd0 00:28:33.385 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:33.385 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:33.385 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:33.385 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:33.385 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:33.385 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:33.385 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:33.385 13:54:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:33.644 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:33.644 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:33.644 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:33.644 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:33.644 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:33.644 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:33.644 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:33.644 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:33.644 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:33.644 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:33.904 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:34.162 [2024-07-12 13:54:22.614080] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:34.162 [2024-07-12 13:54:22.614137] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:34.162 [2024-07-12 13:54:22.614160] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x183f8a0 00:28:34.162 [2024-07-12 13:54:22.614173] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:34.162 [2024-07-12 13:54:22.615734] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:34.162 [2024-07-12 13:54:22.615767] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:34.162 [2024-07-12 13:54:22.615837] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:34.162 [2024-07-12 13:54:22.615863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:34.162 [2024-07-12 13:54:22.615973] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev2 is claimed 00:28:34.162 spare 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.162 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:34.162 [2024-07-12 13:54:22.716285] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1751100 00:28:34.162 [2024-07-12 13:54:22.716306] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:34.162 [2024-07-12 13:54:22.716397] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1750dc0 00:28:34.162 [2024-07-12 13:54:22.716534] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1751100 00:28:34.162 [2024-07-12 13:54:22.716544] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1751100 00:28:34.162 [2024-07-12 13:54:22.716627] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:34.421 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:34.421 "name": "raid_bdev1", 00:28:34.421 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:34.421 "strip_size_kb": 0, 00:28:34.421 "state": "online", 00:28:34.421 "raid_level": "raid1", 00:28:34.421 "superblock": true, 00:28:34.421 "num_base_bdevs": 2, 00:28:34.421 "num_base_bdevs_discovered": 2, 00:28:34.421 "num_base_bdevs_operational": 2, 00:28:34.421 "base_bdevs_list": [ 00:28:34.421 { 00:28:34.421 "name": "spare", 00:28:34.421 "uuid": "40a18204-be82-538b-bd59-c61984a9e260", 00:28:34.421 "is_configured": true, 00:28:34.421 "data_offset": 256, 00:28:34.421 "data_size": 7936 00:28:34.421 }, 00:28:34.421 { 00:28:34.421 "name": "BaseBdev2", 00:28:34.421 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:34.421 "is_configured": true, 00:28:34.421 "data_offset": 256, 00:28:34.421 "data_size": 7936 00:28:34.421 } 00:28:34.421 ] 00:28:34.421 }' 00:28:34.421 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:34.421 13:54:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 
-- # set +x 00:28:34.992 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:34.992 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:34.992 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:34.992 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:34.992 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:34.992 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.992 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:35.251 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:35.251 "name": "raid_bdev1", 00:28:35.251 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:35.251 "strip_size_kb": 0, 00:28:35.251 "state": "online", 00:28:35.251 "raid_level": "raid1", 00:28:35.251 "superblock": true, 00:28:35.251 "num_base_bdevs": 2, 00:28:35.251 "num_base_bdevs_discovered": 2, 00:28:35.251 "num_base_bdevs_operational": 2, 00:28:35.251 "base_bdevs_list": [ 00:28:35.251 { 00:28:35.251 "name": "spare", 00:28:35.251 "uuid": "40a18204-be82-538b-bd59-c61984a9e260", 00:28:35.251 "is_configured": true, 00:28:35.251 "data_offset": 256, 00:28:35.251 "data_size": 7936 00:28:35.251 }, 00:28:35.251 { 00:28:35.251 "name": "BaseBdev2", 00:28:35.251 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:35.251 "is_configured": true, 00:28:35.251 "data_offset": 256, 00:28:35.251 "data_size": 7936 00:28:35.251 } 00:28:35.251 ] 00:28:35.251 }' 00:28:35.251 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:35.251 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:35.251 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:35.251 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:35.251 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.251 13:54:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:35.510 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:35.510 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:35.769 [2024-07-12 13:54:24.294670] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:35.769 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:35.769 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:35.769 13:54:24 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:35.769 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:35.769 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:35.769 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:35.769 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:35.769 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:35.769 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:35.769 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:35.769 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.770 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:36.029 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:36.029 "name": "raid_bdev1", 00:28:36.029 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:36.029 "strip_size_kb": 0, 00:28:36.029 "state": "online", 00:28:36.029 "raid_level": "raid1", 00:28:36.029 "superblock": true, 00:28:36.029 "num_base_bdevs": 2, 00:28:36.029 "num_base_bdevs_discovered": 1, 00:28:36.029 "num_base_bdevs_operational": 1, 00:28:36.029 "base_bdevs_list": [ 00:28:36.029 { 00:28:36.029 "name": null, 00:28:36.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:36.029 "is_configured": false, 00:28:36.029 "data_offset": 256, 00:28:36.029 "data_size": 7936 00:28:36.029 }, 00:28:36.029 { 00:28:36.029 "name": "BaseBdev2", 00:28:36.029 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:36.029 "is_configured": true, 00:28:36.029 "data_offset": 256, 00:28:36.029 "data_size": 7936 00:28:36.029 } 00:28:36.029 ] 00:28:36.029 }' 00:28:36.029 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:36.029 13:54:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:36.597 13:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:36.856 [2024-07-12 13:54:25.373534] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:36.856 [2024-07-12 13:54:25.373696] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:36.856 [2024-07-12 13:54:25.373712] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
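The xtrace above is the core of the md_separate rebuild check: the spare base bdev is pulled out of raid_bdev1 and then handed back, and because its superblock seq_number (4) is older than the raid bdev's (5) a rebuild is started. A minimal sketch of that remove/re-add cycle, using only the rpc.py invocations and jq filter that appear in this log (the $rpc shorthand is illustrative and not part of the test scripts):

# remove the spare; raid_bdev1 stays online with 1 of 2 base bdevs discovered
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc bdev_raid_remove_base_bdev spare
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .num_base_bdevs_discovered'
# re-adding the stale copy (superblock seq_number 4 < raid_bdev1's 5) kicks off the rebuild
$rpc bdev_raid_add_base_bdev raid_bdev1 spare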
00:28:36.856 [2024-07-12 13:54:25.373747] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:36.856 [2024-07-12 13:54:25.375989] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16eabc0 00:28:36.856 [2024-07-12 13:54:25.377330] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:36.856 13:54:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:38.233 "name": "raid_bdev1", 00:28:38.233 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:38.233 "strip_size_kb": 0, 00:28:38.233 "state": "online", 00:28:38.233 "raid_level": "raid1", 00:28:38.233 "superblock": true, 00:28:38.233 "num_base_bdevs": 2, 00:28:38.233 "num_base_bdevs_discovered": 2, 00:28:38.233 "num_base_bdevs_operational": 2, 00:28:38.233 "process": { 00:28:38.233 "type": "rebuild", 00:28:38.233 "target": "spare", 00:28:38.233 "progress": { 00:28:38.233 "blocks": 3072, 00:28:38.233 "percent": 38 00:28:38.233 } 00:28:38.233 }, 00:28:38.233 "base_bdevs_list": [ 00:28:38.233 { 00:28:38.233 "name": "spare", 00:28:38.233 "uuid": "40a18204-be82-538b-bd59-c61984a9e260", 00:28:38.233 "is_configured": true, 00:28:38.233 "data_offset": 256, 00:28:38.233 "data_size": 7936 00:28:38.233 }, 00:28:38.233 { 00:28:38.233 "name": "BaseBdev2", 00:28:38.233 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:38.233 "is_configured": true, 00:28:38.233 "data_offset": 256, 00:28:38.233 "data_size": 7936 00:28:38.233 } 00:28:38.233 ] 00:28:38.233 }' 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:38.233 13:54:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:38.531 [2024-07-12 13:54:26.971877] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:38.531 [2024-07-12 13:54:26.990123] bdev_raid.c:2513:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:38.531 [2024-07-12 13:54:26.990168] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:38.531 [2024-07-12 13:54:26.990183] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:38.531 [2024-07-12 13:54:26.990191] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.531 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.816 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:38.816 "name": "raid_bdev1", 00:28:38.816 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:38.816 "strip_size_kb": 0, 00:28:38.816 "state": "online", 00:28:38.816 "raid_level": "raid1", 00:28:38.816 "superblock": true, 00:28:38.816 "num_base_bdevs": 2, 00:28:38.816 "num_base_bdevs_discovered": 1, 00:28:38.816 "num_base_bdevs_operational": 1, 00:28:38.816 "base_bdevs_list": [ 00:28:38.816 { 00:28:38.816 "name": null, 00:28:38.816 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:38.816 "is_configured": false, 00:28:38.816 "data_offset": 256, 00:28:38.816 "data_size": 7936 00:28:38.816 }, 00:28:38.816 { 00:28:38.816 "name": "BaseBdev2", 00:28:38.816 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:38.816 "is_configured": true, 00:28:38.816 "data_offset": 256, 00:28:38.816 "data_size": 7936 00:28:38.816 } 00:28:38.816 ] 00:28:38.816 }' 00:28:38.816 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:38.816 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:39.384 13:54:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:39.643 [2024-07-12 13:54:28.092144] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 
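At this point the test deletes the spare passthru while it is still the rebuild target (hence the "Failed to remove target bdev: No such device" warning above) and then recreates it, after which examine finds the raid superblock again and a fresh rebuild can start. An illustrative sketch of that tear-down/re-create pair; the bdev names spare_delay/spare and the socket path are taken from this log:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# deleting the passthru rips the target out from under the running rebuild
$rpc bdev_passthru_delete spare
# recreating it over spare_delay lets examine re-add it to raid_bdev1
$rpc bdev_passthru_create -b spare_delay -p spare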
00:28:39.643 [2024-07-12 13:54:28.092201] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:39.643 [2024-07-12 13:54:28.092227] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17505e0 00:28:39.643 [2024-07-12 13:54:28.092240] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:39.643 [2024-07-12 13:54:28.092478] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:39.643 [2024-07-12 13:54:28.092495] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:39.643 [2024-07-12 13:54:28.092555] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:39.643 [2024-07-12 13:54:28.092567] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:39.643 [2024-07-12 13:54:28.092579] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:39.643 [2024-07-12 13:54:28.092598] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:39.643 [2024-07-12 13:54:28.094823] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16eabc0 00:28:39.643 [2024-07-12 13:54:28.096171] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:39.643 spare 00:28:39.643 13:54:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:40.581 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:40.581 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:40.581 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:40.581 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:40.581 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:40.581 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.581 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:40.840 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:40.840 "name": "raid_bdev1", 00:28:40.840 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:40.840 "strip_size_kb": 0, 00:28:40.840 "state": "online", 00:28:40.840 "raid_level": "raid1", 00:28:40.840 "superblock": true, 00:28:40.840 "num_base_bdevs": 2, 00:28:40.840 "num_base_bdevs_discovered": 2, 00:28:40.840 "num_base_bdevs_operational": 2, 00:28:40.840 "process": { 00:28:40.840 "type": "rebuild", 00:28:40.840 "target": "spare", 00:28:40.840 "progress": { 00:28:40.840 "blocks": 3072, 00:28:40.840 "percent": 38 00:28:40.840 } 00:28:40.840 }, 00:28:40.840 "base_bdevs_list": [ 00:28:40.840 { 00:28:40.840 "name": "spare", 00:28:40.840 "uuid": "40a18204-be82-538b-bd59-c61984a9e260", 00:28:40.840 "is_configured": true, 00:28:40.840 "data_offset": 256, 00:28:40.840 "data_size": 7936 00:28:40.840 }, 00:28:40.840 { 00:28:40.840 "name": "BaseBdev2", 00:28:40.840 "uuid": 
"44552109-2905-514e-9479-ac86769188e7", 00:28:40.840 "is_configured": true, 00:28:40.840 "data_offset": 256, 00:28:40.840 "data_size": 7936 00:28:40.840 } 00:28:40.840 ] 00:28:40.840 }' 00:28:40.840 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:41.100 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:41.100 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:41.100 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:41.100 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:41.359 [2024-07-12 13:54:29.697195] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:41.359 [2024-07-12 13:54:29.708898] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:41.359 [2024-07-12 13:54:29.708946] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:41.359 [2024-07-12 13:54:29.708962] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:41.359 [2024-07-12 13:54:29.708970] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.359 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:41.618 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:41.618 "name": "raid_bdev1", 00:28:41.618 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:41.618 "strip_size_kb": 0, 00:28:41.618 "state": "online", 00:28:41.618 "raid_level": "raid1", 00:28:41.619 "superblock": true, 00:28:41.619 "num_base_bdevs": 2, 00:28:41.619 "num_base_bdevs_discovered": 1, 00:28:41.619 
"num_base_bdevs_operational": 1, 00:28:41.619 "base_bdevs_list": [ 00:28:41.619 { 00:28:41.619 "name": null, 00:28:41.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:41.619 "is_configured": false, 00:28:41.619 "data_offset": 256, 00:28:41.619 "data_size": 7936 00:28:41.619 }, 00:28:41.619 { 00:28:41.619 "name": "BaseBdev2", 00:28:41.619 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:41.619 "is_configured": true, 00:28:41.619 "data_offset": 256, 00:28:41.619 "data_size": 7936 00:28:41.619 } 00:28:41.619 ] 00:28:41.619 }' 00:28:41.619 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:41.619 13:54:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:42.188 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:42.188 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:42.188 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:42.188 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:42.188 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:42.188 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:42.188 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:42.447 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:42.447 "name": "raid_bdev1", 00:28:42.447 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:42.447 "strip_size_kb": 0, 00:28:42.447 "state": "online", 00:28:42.447 "raid_level": "raid1", 00:28:42.447 "superblock": true, 00:28:42.447 "num_base_bdevs": 2, 00:28:42.447 "num_base_bdevs_discovered": 1, 00:28:42.447 "num_base_bdevs_operational": 1, 00:28:42.447 "base_bdevs_list": [ 00:28:42.447 { 00:28:42.447 "name": null, 00:28:42.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:42.447 "is_configured": false, 00:28:42.447 "data_offset": 256, 00:28:42.447 "data_size": 7936 00:28:42.447 }, 00:28:42.447 { 00:28:42.447 "name": "BaseBdev2", 00:28:42.447 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:42.447 "is_configured": true, 00:28:42.447 "data_offset": 256, 00:28:42.447 "data_size": 7936 00:28:42.447 } 00:28:42.447 ] 00:28:42.447 }' 00:28:42.447 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:42.447 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:42.447 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:42.447 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:42.447 13:54:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:42.706 13:54:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:43.274 [2024-07-12 13:54:31.588868] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:43.274 [2024-07-12 13:54:31.588924] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:43.274 [2024-07-12 13:54:31.588954] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16eb240 00:28:43.274 [2024-07-12 13:54:31.588967] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:43.274 [2024-07-12 13:54:31.589185] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:43.274 [2024-07-12 13:54:31.589204] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:43.274 [2024-07-12 13:54:31.589254] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:43.274 [2024-07-12 13:54:31.589267] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:43.274 [2024-07-12 13:54:31.589280] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:43.274 BaseBdev1 00:28:43.274 13:54:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:44.211 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:44.211 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:44.211 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:44.211 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:44.211 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:44.211 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:44.211 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:44.211 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:44.211 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:44.212 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:44.212 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:44.212 13:54:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:44.779 13:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:44.779 "name": "raid_bdev1", 00:28:44.779 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:44.779 "strip_size_kb": 0, 00:28:44.779 "state": "online", 00:28:44.779 "raid_level": "raid1", 00:28:44.779 "superblock": true, 00:28:44.779 "num_base_bdevs": 2, 00:28:44.779 "num_base_bdevs_discovered": 1, 00:28:44.779 "num_base_bdevs_operational": 1, 00:28:44.779 "base_bdevs_list": [ 00:28:44.779 { 
00:28:44.779 "name": null, 00:28:44.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:44.779 "is_configured": false, 00:28:44.779 "data_offset": 256, 00:28:44.779 "data_size": 7936 00:28:44.779 }, 00:28:44.779 { 00:28:44.779 "name": "BaseBdev2", 00:28:44.779 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:44.779 "is_configured": true, 00:28:44.779 "data_offset": 256, 00:28:44.779 "data_size": 7936 00:28:44.779 } 00:28:44.779 ] 00:28:44.779 }' 00:28:44.779 13:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:44.779 13:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:45.346 13:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:45.346 13:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:45.346 13:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:45.346 13:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:45.346 13:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:45.346 13:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:45.346 13:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:45.604 13:54:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:45.604 "name": "raid_bdev1", 00:28:45.604 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:45.604 "strip_size_kb": 0, 00:28:45.604 "state": "online", 00:28:45.604 "raid_level": "raid1", 00:28:45.604 "superblock": true, 00:28:45.604 "num_base_bdevs": 2, 00:28:45.604 "num_base_bdevs_discovered": 1, 00:28:45.604 "num_base_bdevs_operational": 1, 00:28:45.604 "base_bdevs_list": [ 00:28:45.604 { 00:28:45.604 "name": null, 00:28:45.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:45.604 "is_configured": false, 00:28:45.604 "data_offset": 256, 00:28:45.604 "data_size": 7936 00:28:45.604 }, 00:28:45.604 { 00:28:45.604 "name": "BaseBdev2", 00:28:45.604 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:45.604 "is_configured": true, 00:28:45.604 "data_offset": 256, 00:28:45.604 "data_size": 7936 00:28:45.604 } 00:28:45.604 ] 00:28:45.604 }' 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:45.604 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:45.863 [2024-07-12 13:54:34.348214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:45.863 [2024-07-12 13:54:34.348358] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:45.863 [2024-07-12 13:54:34.348375] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:45.863 request: 00:28:45.863 { 00:28:45.863 "base_bdev": "BaseBdev1", 00:28:45.863 "raid_bdev": "raid_bdev1", 00:28:45.863 "method": "bdev_raid_add_base_bdev", 00:28:45.863 "req_id": 1 00:28:45.863 } 00:28:45.863 Got JSON-RPC error response 00:28:45.863 response: 00:28:45.863 { 00:28:45.863 "code": -22, 00:28:45.863 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:45.863 } 00:28:45.863 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:28:45.863 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:45.863 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:45.863 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:45.863 13:54:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:46.798 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:46.798 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:46.798 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:46.798 13:54:35 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:46.798 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:46.798 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:46.798 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:46.798 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:46.798 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:46.798 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:46.798 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.798 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.057 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:47.057 "name": "raid_bdev1", 00:28:47.057 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:47.057 "strip_size_kb": 0, 00:28:47.057 "state": "online", 00:28:47.057 "raid_level": "raid1", 00:28:47.057 "superblock": true, 00:28:47.057 "num_base_bdevs": 2, 00:28:47.057 "num_base_bdevs_discovered": 1, 00:28:47.057 "num_base_bdevs_operational": 1, 00:28:47.057 "base_bdevs_list": [ 00:28:47.057 { 00:28:47.057 "name": null, 00:28:47.057 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:47.057 "is_configured": false, 00:28:47.057 "data_offset": 256, 00:28:47.057 "data_size": 7936 00:28:47.057 }, 00:28:47.057 { 00:28:47.057 "name": "BaseBdev2", 00:28:47.057 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:47.057 "is_configured": true, 00:28:47.057 "data_offset": 256, 00:28:47.057 "data_size": 7936 00:28:47.057 } 00:28:47.057 ] 00:28:47.057 }' 00:28:47.057 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:47.057 13:54:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:47.994 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:47.994 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:47.994 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:47.994 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:47.994 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:47.994 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.994 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.994 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:47.994 "name": "raid_bdev1", 00:28:47.994 "uuid": "3cdac684-e213-4255-9f68-4395cce63ba2", 00:28:47.994 "strip_size_kb": 0, 
00:28:47.994 "state": "online", 00:28:47.994 "raid_level": "raid1", 00:28:47.994 "superblock": true, 00:28:47.994 "num_base_bdevs": 2, 00:28:47.994 "num_base_bdevs_discovered": 1, 00:28:47.994 "num_base_bdevs_operational": 1, 00:28:47.994 "base_bdevs_list": [ 00:28:47.994 { 00:28:47.994 "name": null, 00:28:47.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:47.994 "is_configured": false, 00:28:47.994 "data_offset": 256, 00:28:47.994 "data_size": 7936 00:28:47.994 }, 00:28:47.994 { 00:28:47.994 "name": "BaseBdev2", 00:28:47.994 "uuid": "44552109-2905-514e-9479-ac86769188e7", 00:28:47.994 "is_configured": true, 00:28:47.994 "data_offset": 256, 00:28:47.994 "data_size": 7936 00:28:47.994 } 00:28:47.994 ] 00:28:47.994 }' 00:28:47.995 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:47.995 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:47.995 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:47.995 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:47.995 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 584391 00:28:47.995 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 584391 ']' 00:28:47.995 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 584391 00:28:47.995 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:48.254 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:48.254 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 584391 00:28:48.254 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:48.254 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:48.254 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 584391' 00:28:48.254 killing process with pid 584391 00:28:48.254 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 584391 00:28:48.254 Received shutdown signal, test time was about 60.000000 seconds 00:28:48.254 00:28:48.254 Latency(us) 00:28:48.254 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:48.254 =================================================================================================================== 00:28:48.254 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:48.254 [2024-07-12 13:54:36.619128] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:48.254 [2024-07-12 13:54:36.619220] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:48.254 [2024-07-12 13:54:36.619265] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:48.254 [2024-07-12 13:54:36.619279] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1751100 name raid_bdev1, state offline 00:28:48.254 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 584391 
00:28:48.254 [2024-07-12 13:54:36.653896] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:48.513 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:28:48.513 00:28:48.513 real 0m31.859s 00:28:48.513 user 0m49.797s 00:28:48.513 sys 0m5.234s 00:28:48.513 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:48.513 13:54:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:48.513 ************************************ 00:28:48.513 END TEST raid_rebuild_test_sb_md_separate 00:28:48.513 ************************************ 00:28:48.513 13:54:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:48.513 13:54:36 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:28:48.513 13:54:36 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:28:48.513 13:54:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:48.513 13:54:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:48.513 13:54:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:48.513 ************************************ 00:28:48.513 START TEST raid_state_function_test_sb_md_interleaved 00:28:48.513 ************************************ 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=589124 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 589124' 00:28:48.513 Process raid pid: 589124 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 589124 /var/tmp/spdk-raid.sock 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 589124 ']' 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:48.513 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:48.513 13:54:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:48.513 [2024-07-12 13:54:37.031465] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
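A new test, raid_state_function_test with a superblock (-s) and md-interleaved malloc bdevs, starts here by launching bdev_svc as its RPC target on /var/tmp/spdk-raid.sock and waiting for it to answer before any raid RPCs are issued. A sketch of that bring-up, with the bdev_svc command line copied from the trace; the polling loop stands in for the real waitforlisten helper, and probing readiness via rpc_get_methods is an assumption, not something shown in this log:

svc=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

$svc -r "$sock" -i 0 -L bdev_raid &
raid_pid=$!

# block until the app is listening on the UNIX socket before issuing raid RPCs
until $rpc -s "$sock" rpc_get_methods >/dev/null 2>&1; do
    sleep 0.1
done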
00:28:48.513 [2024-07-12 13:54:37.031542] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:48.772 [2024-07-12 13:54:37.159691] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:48.772 [2024-07-12 13:54:37.262544] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:48.772 [2024-07-12 13:54:37.322397] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:48.772 [2024-07-12 13:54:37.322425] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:49.707 13:54:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:49.707 13:54:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:49.707 13:54:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:49.707 [2024-07-12 13:54:38.192251] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:49.707 [2024-07-12 13:54:38.192293] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:49.707 [2024-07-12 13:54:38.192304] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:49.707 [2024-07-12 13:54:38.192316] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:49.707 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:49.966 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:49.966 "name": "Existed_Raid", 00:28:49.966 "uuid": "2c929c7f-98ff-43b3-b279-3168d1bab75e", 00:28:49.966 "strip_size_kb": 0, 00:28:49.966 "state": "configuring", 00:28:49.966 "raid_level": "raid1", 00:28:49.966 "superblock": true, 00:28:49.966 "num_base_bdevs": 2, 00:28:49.966 "num_base_bdevs_discovered": 0, 00:28:49.966 "num_base_bdevs_operational": 2, 00:28:49.966 "base_bdevs_list": [ 00:28:49.966 { 00:28:49.966 "name": "BaseBdev1", 00:28:49.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:49.966 "is_configured": false, 00:28:49.966 "data_offset": 0, 00:28:49.966 "data_size": 0 00:28:49.966 }, 00:28:49.966 { 00:28:49.966 "name": "BaseBdev2", 00:28:49.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:49.966 "is_configured": false, 00:28:49.966 "data_offset": 0, 00:28:49.966 "data_size": 0 00:28:49.966 } 00:28:49.966 ] 00:28:49.966 }' 00:28:49.966 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:49.966 13:54:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:50.533 13:54:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:50.791 [2024-07-12 13:54:39.234875] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:50.791 [2024-07-12 13:54:39.234908] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1818330 name Existed_Raid, state configuring 00:28:50.791 13:54:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:51.049 [2024-07-12 13:54:39.479529] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:51.049 [2024-07-12 13:54:39.479564] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:51.049 [2024-07-12 13:54:39.479574] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:51.049 [2024-07-12 13:54:39.479586] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:51.049 13:54:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:28:51.308 [2024-07-12 13:54:39.735504] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:51.308 BaseBdev1 00:28:51.308 13:54:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:51.308 13:54:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:28:51.308 13:54:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:51.308 13:54:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:28:51.308 13:54:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:51.308 13:54:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:51.308 13:54:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:51.567 13:54:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:51.827 [ 00:28:51.827 { 00:28:51.827 "name": "BaseBdev1", 00:28:51.827 "aliases": [ 00:28:51.827 "489a72e7-2834-49ab-ab35-f656a177543a" 00:28:51.827 ], 00:28:51.827 "product_name": "Malloc disk", 00:28:51.827 "block_size": 4128, 00:28:51.827 "num_blocks": 8192, 00:28:51.827 "uuid": "489a72e7-2834-49ab-ab35-f656a177543a", 00:28:51.827 "md_size": 32, 00:28:51.827 "md_interleave": true, 00:28:51.827 "dif_type": 0, 00:28:51.827 "assigned_rate_limits": { 00:28:51.827 "rw_ios_per_sec": 0, 00:28:51.827 "rw_mbytes_per_sec": 0, 00:28:51.827 "r_mbytes_per_sec": 0, 00:28:51.827 "w_mbytes_per_sec": 0 00:28:51.827 }, 00:28:51.827 "claimed": true, 00:28:51.827 "claim_type": "exclusive_write", 00:28:51.827 "zoned": false, 00:28:51.827 "supported_io_types": { 00:28:51.827 "read": true, 00:28:51.827 "write": true, 00:28:51.827 "unmap": true, 00:28:51.827 "flush": true, 00:28:51.827 "reset": true, 00:28:51.827 "nvme_admin": false, 00:28:51.827 "nvme_io": false, 00:28:51.827 "nvme_io_md": false, 00:28:51.827 "write_zeroes": true, 00:28:51.827 "zcopy": true, 00:28:51.827 "get_zone_info": false, 00:28:51.827 "zone_management": false, 00:28:51.827 "zone_append": false, 00:28:51.827 "compare": false, 00:28:51.827 "compare_and_write": false, 00:28:51.827 "abort": true, 00:28:51.827 "seek_hole": false, 00:28:51.827 "seek_data": false, 00:28:51.827 "copy": true, 00:28:51.827 "nvme_iov_md": false 00:28:51.827 }, 00:28:51.827 "memory_domains": [ 00:28:51.827 { 00:28:51.827 "dma_device_id": "system", 00:28:51.827 "dma_device_type": 1 00:28:51.827 }, 00:28:51.827 { 00:28:51.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:51.827 "dma_device_type": 2 00:28:51.827 } 00:28:51.827 ], 00:28:51.827 "driver_specific": {} 00:28:51.827 } 00:28:51.827 ] 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:51.827 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:52.086 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:52.086 "name": "Existed_Raid", 00:28:52.086 "uuid": "cd37b2a8-d0f2-49d6-8900-187ebc42ce1e", 00:28:52.086 "strip_size_kb": 0, 00:28:52.086 "state": "configuring", 00:28:52.086 "raid_level": "raid1", 00:28:52.086 "superblock": true, 00:28:52.086 "num_base_bdevs": 2, 00:28:52.086 "num_base_bdevs_discovered": 1, 00:28:52.086 "num_base_bdevs_operational": 2, 00:28:52.086 "base_bdevs_list": [ 00:28:52.086 { 00:28:52.086 "name": "BaseBdev1", 00:28:52.086 "uuid": "489a72e7-2834-49ab-ab35-f656a177543a", 00:28:52.086 "is_configured": true, 00:28:52.086 "data_offset": 256, 00:28:52.086 "data_size": 7936 00:28:52.086 }, 00:28:52.086 { 00:28:52.086 "name": "BaseBdev2", 00:28:52.086 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:52.086 "is_configured": false, 00:28:52.086 "data_offset": 0, 00:28:52.086 "data_size": 0 00:28:52.086 } 00:28:52.086 ] 00:28:52.086 }' 00:28:52.086 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:52.086 13:54:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:52.654 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:52.913 [2024-07-12 13:54:41.307716] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:52.913 [2024-07-12 13:54:41.307763] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1817c20 name Existed_Raid, state configuring 00:28:52.913 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:53.172 [2024-07-12 13:54:41.552411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:53.172 [2024-07-12 13:54:41.553952] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:53.172 [2024-07-12 13:54:41.553988] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:53.172 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:53.432 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:53.432 "name": "Existed_Raid", 00:28:53.432 "uuid": "5a29036b-d9ad-4882-841f-9a1d66338721", 00:28:53.432 "strip_size_kb": 0, 00:28:53.432 "state": "configuring", 00:28:53.432 "raid_level": "raid1", 00:28:53.432 "superblock": true, 00:28:53.432 "num_base_bdevs": 2, 00:28:53.432 "num_base_bdevs_discovered": 1, 00:28:53.432 "num_base_bdevs_operational": 2, 00:28:53.432 "base_bdevs_list": [ 00:28:53.432 { 00:28:53.432 "name": "BaseBdev1", 00:28:53.432 "uuid": "489a72e7-2834-49ab-ab35-f656a177543a", 00:28:53.432 "is_configured": true, 00:28:53.432 "data_offset": 256, 00:28:53.432 "data_size": 7936 00:28:53.432 }, 00:28:53.432 { 00:28:53.432 "name": "BaseBdev2", 00:28:53.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:53.432 "is_configured": false, 00:28:53.432 "data_offset": 0, 00:28:53.432 "data_size": 0 00:28:53.432 } 00:28:53.432 ] 00:28:53.432 }' 00:28:53.432 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:53.432 13:54:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:53.998 13:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:28:54.258 [2024-07-12 13:54:42.630897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:54.258 [2024-07-12 13:54:42.631047] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1819ac0 00:28:54.258 [2024-07-12 13:54:42.631060] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:54.258 [2024-07-12 13:54:42.631124] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1817320 00:28:54.258 [2024-07-12 13:54:42.631205] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1819ac0 00:28:54.258 [2024-07-12 13:54:42.631214] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
Existed_Raid, raid_bdev 0x1819ac0 00:28:54.258 [2024-07-12 13:54:42.631273] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:54.258 BaseBdev2 00:28:54.258 13:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:54.258 13:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:28:54.258 13:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:54.258 13:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:28:54.258 13:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:54.258 13:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:54.258 13:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:54.518 13:54:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:54.777 [ 00:28:54.777 { 00:28:54.777 "name": "BaseBdev2", 00:28:54.777 "aliases": [ 00:28:54.777 "2fa0d4c5-be18-468f-8a36-230e7c42f986" 00:28:54.777 ], 00:28:54.777 "product_name": "Malloc disk", 00:28:54.777 "block_size": 4128, 00:28:54.777 "num_blocks": 8192, 00:28:54.777 "uuid": "2fa0d4c5-be18-468f-8a36-230e7c42f986", 00:28:54.777 "md_size": 32, 00:28:54.777 "md_interleave": true, 00:28:54.777 "dif_type": 0, 00:28:54.777 "assigned_rate_limits": { 00:28:54.777 "rw_ios_per_sec": 0, 00:28:54.777 "rw_mbytes_per_sec": 0, 00:28:54.777 "r_mbytes_per_sec": 0, 00:28:54.777 "w_mbytes_per_sec": 0 00:28:54.777 }, 00:28:54.777 "claimed": true, 00:28:54.777 "claim_type": "exclusive_write", 00:28:54.777 "zoned": false, 00:28:54.777 "supported_io_types": { 00:28:54.777 "read": true, 00:28:54.777 "write": true, 00:28:54.777 "unmap": true, 00:28:54.777 "flush": true, 00:28:54.777 "reset": true, 00:28:54.777 "nvme_admin": false, 00:28:54.777 "nvme_io": false, 00:28:54.777 "nvme_io_md": false, 00:28:54.777 "write_zeroes": true, 00:28:54.777 "zcopy": true, 00:28:54.777 "get_zone_info": false, 00:28:54.777 "zone_management": false, 00:28:54.777 "zone_append": false, 00:28:54.777 "compare": false, 00:28:54.777 "compare_and_write": false, 00:28:54.777 "abort": true, 00:28:54.777 "seek_hole": false, 00:28:54.777 "seek_data": false, 00:28:54.777 "copy": true, 00:28:54.777 "nvme_iov_md": false 00:28:54.777 }, 00:28:54.777 "memory_domains": [ 00:28:54.777 { 00:28:54.777 "dma_device_id": "system", 00:28:54.777 "dma_device_type": 1 00:28:54.777 }, 00:28:54.777 { 00:28:54.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:54.777 "dma_device_type": 2 00:28:54.777 } 00:28:54.777 ], 00:28:54.777 "driver_specific": {} 00:28:54.777 } 00:28:54.777 ] 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:54.777 13:54:43 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:54.777 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.035 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:55.035 "name": "Existed_Raid", 00:28:55.035 "uuid": "5a29036b-d9ad-4882-841f-9a1d66338721", 00:28:55.035 "strip_size_kb": 0, 00:28:55.035 "state": "online", 00:28:55.036 "raid_level": "raid1", 00:28:55.036 "superblock": true, 00:28:55.036 "num_base_bdevs": 2, 00:28:55.036 "num_base_bdevs_discovered": 2, 00:28:55.036 "num_base_bdevs_operational": 2, 00:28:55.036 "base_bdevs_list": [ 00:28:55.036 { 00:28:55.036 "name": "BaseBdev1", 00:28:55.036 "uuid": "489a72e7-2834-49ab-ab35-f656a177543a", 00:28:55.036 "is_configured": true, 00:28:55.036 "data_offset": 256, 00:28:55.036 "data_size": 7936 00:28:55.036 }, 00:28:55.036 { 00:28:55.036 "name": "BaseBdev2", 00:28:55.036 "uuid": "2fa0d4c5-be18-468f-8a36-230e7c42f986", 00:28:55.036 "is_configured": true, 00:28:55.036 "data_offset": 256, 00:28:55.036 "data_size": 7936 00:28:55.036 } 00:28:55.036 ] 00:28:55.036 }' 00:28:55.036 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:55.036 13:54:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:55.603 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:55.603 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:55.603 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:55.603 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:55.603 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # 
local base_bdev_names 00:28:55.603 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:55.603 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:55.603 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:55.862 [2024-07-12 13:54:44.231483] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:55.862 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:55.862 "name": "Existed_Raid", 00:28:55.862 "aliases": [ 00:28:55.862 "5a29036b-d9ad-4882-841f-9a1d66338721" 00:28:55.862 ], 00:28:55.863 "product_name": "Raid Volume", 00:28:55.863 "block_size": 4128, 00:28:55.863 "num_blocks": 7936, 00:28:55.863 "uuid": "5a29036b-d9ad-4882-841f-9a1d66338721", 00:28:55.863 "md_size": 32, 00:28:55.863 "md_interleave": true, 00:28:55.863 "dif_type": 0, 00:28:55.863 "assigned_rate_limits": { 00:28:55.863 "rw_ios_per_sec": 0, 00:28:55.863 "rw_mbytes_per_sec": 0, 00:28:55.863 "r_mbytes_per_sec": 0, 00:28:55.863 "w_mbytes_per_sec": 0 00:28:55.863 }, 00:28:55.863 "claimed": false, 00:28:55.863 "zoned": false, 00:28:55.863 "supported_io_types": { 00:28:55.863 "read": true, 00:28:55.863 "write": true, 00:28:55.863 "unmap": false, 00:28:55.863 "flush": false, 00:28:55.863 "reset": true, 00:28:55.863 "nvme_admin": false, 00:28:55.863 "nvme_io": false, 00:28:55.863 "nvme_io_md": false, 00:28:55.863 "write_zeroes": true, 00:28:55.863 "zcopy": false, 00:28:55.863 "get_zone_info": false, 00:28:55.863 "zone_management": false, 00:28:55.863 "zone_append": false, 00:28:55.863 "compare": false, 00:28:55.863 "compare_and_write": false, 00:28:55.863 "abort": false, 00:28:55.863 "seek_hole": false, 00:28:55.863 "seek_data": false, 00:28:55.863 "copy": false, 00:28:55.863 "nvme_iov_md": false 00:28:55.863 }, 00:28:55.863 "memory_domains": [ 00:28:55.863 { 00:28:55.863 "dma_device_id": "system", 00:28:55.863 "dma_device_type": 1 00:28:55.863 }, 00:28:55.863 { 00:28:55.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:55.863 "dma_device_type": 2 00:28:55.863 }, 00:28:55.863 { 00:28:55.863 "dma_device_id": "system", 00:28:55.863 "dma_device_type": 1 00:28:55.863 }, 00:28:55.863 { 00:28:55.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:55.863 "dma_device_type": 2 00:28:55.863 } 00:28:55.863 ], 00:28:55.863 "driver_specific": { 00:28:55.863 "raid": { 00:28:55.863 "uuid": "5a29036b-d9ad-4882-841f-9a1d66338721", 00:28:55.863 "strip_size_kb": 0, 00:28:55.863 "state": "online", 00:28:55.863 "raid_level": "raid1", 00:28:55.863 "superblock": true, 00:28:55.863 "num_base_bdevs": 2, 00:28:55.863 "num_base_bdevs_discovered": 2, 00:28:55.863 "num_base_bdevs_operational": 2, 00:28:55.863 "base_bdevs_list": [ 00:28:55.863 { 00:28:55.863 "name": "BaseBdev1", 00:28:55.863 "uuid": "489a72e7-2834-49ab-ab35-f656a177543a", 00:28:55.863 "is_configured": true, 00:28:55.863 "data_offset": 256, 00:28:55.863 "data_size": 7936 00:28:55.863 }, 00:28:55.863 { 00:28:55.863 "name": "BaseBdev2", 00:28:55.863 "uuid": "2fa0d4c5-be18-468f-8a36-230e7c42f986", 00:28:55.863 "is_configured": true, 00:28:55.863 "data_offset": 256, 00:28:55.863 "data_size": 7936 00:28:55.863 } 00:28:55.863 ] 00:28:55.863 } 00:28:55.863 } 00:28:55.863 }' 00:28:55.863 13:54:44 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:55.863 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:55.863 BaseBdev2' 00:28:55.863 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:55.863 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:55.863 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:56.122 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:56.122 "name": "BaseBdev1", 00:28:56.122 "aliases": [ 00:28:56.122 "489a72e7-2834-49ab-ab35-f656a177543a" 00:28:56.122 ], 00:28:56.122 "product_name": "Malloc disk", 00:28:56.122 "block_size": 4128, 00:28:56.122 "num_blocks": 8192, 00:28:56.122 "uuid": "489a72e7-2834-49ab-ab35-f656a177543a", 00:28:56.122 "md_size": 32, 00:28:56.122 "md_interleave": true, 00:28:56.122 "dif_type": 0, 00:28:56.122 "assigned_rate_limits": { 00:28:56.122 "rw_ios_per_sec": 0, 00:28:56.122 "rw_mbytes_per_sec": 0, 00:28:56.122 "r_mbytes_per_sec": 0, 00:28:56.122 "w_mbytes_per_sec": 0 00:28:56.122 }, 00:28:56.122 "claimed": true, 00:28:56.122 "claim_type": "exclusive_write", 00:28:56.122 "zoned": false, 00:28:56.122 "supported_io_types": { 00:28:56.122 "read": true, 00:28:56.122 "write": true, 00:28:56.122 "unmap": true, 00:28:56.122 "flush": true, 00:28:56.122 "reset": true, 00:28:56.122 "nvme_admin": false, 00:28:56.122 "nvme_io": false, 00:28:56.122 "nvme_io_md": false, 00:28:56.122 "write_zeroes": true, 00:28:56.122 "zcopy": true, 00:28:56.122 "get_zone_info": false, 00:28:56.122 "zone_management": false, 00:28:56.122 "zone_append": false, 00:28:56.122 "compare": false, 00:28:56.122 "compare_and_write": false, 00:28:56.122 "abort": true, 00:28:56.122 "seek_hole": false, 00:28:56.122 "seek_data": false, 00:28:56.122 "copy": true, 00:28:56.122 "nvme_iov_md": false 00:28:56.122 }, 00:28:56.122 "memory_domains": [ 00:28:56.122 { 00:28:56.122 "dma_device_id": "system", 00:28:56.122 "dma_device_type": 1 00:28:56.122 }, 00:28:56.122 { 00:28:56.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:56.122 "dma_device_type": 2 00:28:56.122 } 00:28:56.122 ], 00:28:56.122 "driver_specific": {} 00:28:56.122 }' 00:28:56.123 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:56.123 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:56.123 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:56.123 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:56.123 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:56.382 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:56.382 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:56.382 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:56.382 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:56.382 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:56.382 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:56.382 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:56.382 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:56.382 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:56.382 13:54:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:56.642 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:56.642 "name": "BaseBdev2", 00:28:56.642 "aliases": [ 00:28:56.642 "2fa0d4c5-be18-468f-8a36-230e7c42f986" 00:28:56.642 ], 00:28:56.642 "product_name": "Malloc disk", 00:28:56.642 "block_size": 4128, 00:28:56.642 "num_blocks": 8192, 00:28:56.642 "uuid": "2fa0d4c5-be18-468f-8a36-230e7c42f986", 00:28:56.642 "md_size": 32, 00:28:56.642 "md_interleave": true, 00:28:56.642 "dif_type": 0, 00:28:56.642 "assigned_rate_limits": { 00:28:56.642 "rw_ios_per_sec": 0, 00:28:56.642 "rw_mbytes_per_sec": 0, 00:28:56.642 "r_mbytes_per_sec": 0, 00:28:56.642 "w_mbytes_per_sec": 0 00:28:56.642 }, 00:28:56.642 "claimed": true, 00:28:56.642 "claim_type": "exclusive_write", 00:28:56.642 "zoned": false, 00:28:56.642 "supported_io_types": { 00:28:56.642 "read": true, 00:28:56.642 "write": true, 00:28:56.642 "unmap": true, 00:28:56.642 "flush": true, 00:28:56.642 "reset": true, 00:28:56.642 "nvme_admin": false, 00:28:56.642 "nvme_io": false, 00:28:56.642 "nvme_io_md": false, 00:28:56.642 "write_zeroes": true, 00:28:56.642 "zcopy": true, 00:28:56.642 "get_zone_info": false, 00:28:56.642 "zone_management": false, 00:28:56.642 "zone_append": false, 00:28:56.642 "compare": false, 00:28:56.642 "compare_and_write": false, 00:28:56.642 "abort": true, 00:28:56.642 "seek_hole": false, 00:28:56.642 "seek_data": false, 00:28:56.642 "copy": true, 00:28:56.642 "nvme_iov_md": false 00:28:56.642 }, 00:28:56.642 "memory_domains": [ 00:28:56.642 { 00:28:56.642 "dma_device_id": "system", 00:28:56.642 "dma_device_type": 1 00:28:56.642 }, 00:28:56.642 { 00:28:56.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:56.642 "dma_device_type": 2 00:28:56.642 } 00:28:56.642 ], 00:28:56.642 "driver_specific": {} 00:28:56.642 }' 00:28:56.642 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:56.642 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:56.901 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:56.901 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:56.901 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:56.901 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 
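The property checks above reduce to fetching a bdev's JSON over the RPC socket and comparing four jq fields (block_size, md_size, md_interleave, dif_type). A minimal standalone sketch of the same check, assuming the test app is still listening on /var/tmp/spdk-raid.sock; the helper name check_md_interleaved is illustrative and not part of bdev_raid.sh:

# Sketch: confirm a bdev exposes 4128-byte blocks with 32 bytes of interleaved
# metadata and no DIF, mirroring the jq checks at bdev_raid.sh@205-208 above.
rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
check_md_interleaved() {
    local info
    # bdev_get_bdevs -b <name> returns a one-element array; jq '.[]' unwraps it
    info=$($rpc_py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b "$1" | jq '.[]')
    [[ $(jq .block_size <<< "$info") -eq 4128 ]] &&
    [[ $(jq .md_size <<< "$info") -eq 32 ]] &&
    [[ $(jq .md_interleave <<< "$info") == true ]] &&
    [[ $(jq .dif_type <<< "$info") -eq 0 ]]
}
check_md_interleaved BaseBdev1 && check_md_interleaved BaseBdev2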
00:28:56.901 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:56.901 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:56.901 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:56.901 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:56.901 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:57.161 [2024-07-12 13:54:45.638973] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.161 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:57.420 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:57.420 "name": "Existed_Raid", 00:28:57.420 "uuid": 
"5a29036b-d9ad-4882-841f-9a1d66338721", 00:28:57.420 "strip_size_kb": 0, 00:28:57.420 "state": "online", 00:28:57.420 "raid_level": "raid1", 00:28:57.420 "superblock": true, 00:28:57.420 "num_base_bdevs": 2, 00:28:57.420 "num_base_bdevs_discovered": 1, 00:28:57.420 "num_base_bdevs_operational": 1, 00:28:57.420 "base_bdevs_list": [ 00:28:57.420 { 00:28:57.420 "name": null, 00:28:57.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:57.420 "is_configured": false, 00:28:57.420 "data_offset": 256, 00:28:57.420 "data_size": 7936 00:28:57.420 }, 00:28:57.420 { 00:28:57.420 "name": "BaseBdev2", 00:28:57.420 "uuid": "2fa0d4c5-be18-468f-8a36-230e7c42f986", 00:28:57.420 "is_configured": true, 00:28:57.420 "data_offset": 256, 00:28:57.420 "data_size": 7936 00:28:57.420 } 00:28:57.420 ] 00:28:57.420 }' 00:28:57.420 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:57.420 13:54:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:57.988 13:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:57.988 13:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:57.988 13:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:57.988 13:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:58.247 13:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:58.247 13:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:58.247 13:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:58.507 [2024-07-12 13:54:46.899441] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:58.507 [2024-07-12 13:54:46.899527] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:58.507 [2024-07-12 13:54:46.910801] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:58.507 [2024-07-12 13:54:46.910837] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:58.507 [2024-07-12 13:54:46.910848] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1819ac0 name Existed_Raid, state offline 00:28:58.507 13:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:58.507 13:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:58.507 13:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:58.507 13:54:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:58.767 13:54:47 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 589124 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 589124 ']' 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 589124 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 589124 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 589124' 00:28:58.767 killing process with pid 589124 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 589124 00:28:58.767 [2024-07-12 13:54:47.230095] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:58.767 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 589124 00:28:58.767 [2024-07-12 13:54:47.230979] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:59.026 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:28:59.026 00:28:59.026 real 0m10.474s 00:28:59.026 user 0m18.477s 00:28:59.026 sys 0m2.080s 00:28:59.026 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:59.026 13:54:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:59.026 ************************************ 00:28:59.026 END TEST raid_state_function_test_sb_md_interleaved 00:28:59.026 ************************************ 00:28:59.026 13:54:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:59.026 13:54:47 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:28:59.026 13:54:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:59.026 13:54:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:59.026 13:54:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:59.026 ************************************ 00:28:59.026 START TEST raid_superblock_test_md_interleaved 00:28:59.026 ************************************ 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local 
num_base_bdevs=2 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=590646 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 590646 /var/tmp/spdk-raid.sock 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 590646 ']' 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:59.026 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:59.026 13:54:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:59.285 [2024-07-12 13:54:47.640582] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
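For context, the superblock variant that starts here drives the same RPC socket; condensed from the bdev_raid.sh@424-429 steps logged below, the bdev stack it assembles is roughly the following (a sketch only: $rpc is shorthand for the full rpc.py invocation and is not a variable the harness defines):

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Two 32 MiB malloc bdevs with 4096-byte blocks and 32 bytes of interleaved metadata,
# giving the 8192 blocks of 4128 bytes seen in the bdev dumps
$rpc bdev_malloc_create 32 4096 -m 32 -i -b malloc1
$rpc bdev_malloc_create 32 4096 -m 32 -i -b malloc2

# Wrap each malloc in a passthru bdev with a fixed UUID (pt1/pt2)
$rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

# RAID1 over the passthru pair, with an on-disk superblock (-s)
$rpc bdev_raid_create -s -r raid1 -b 'pt1 pt2' -n raid_bdev1

# The volume should come up online with both base bdevs configured
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'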
00:28:59.285 [2024-07-12 13:54:47.640721] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid590646 ] 00:28:59.285 [2024-07-12 13:54:47.838234] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:59.545 [2024-07-12 13:54:47.940350] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:59.545 [2024-07-12 13:54:48.007048] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:59.545 [2024-07-12 13:54:48.007088] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:00.113 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:00.113 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:00.113 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:29:00.113 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:00.113 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:29:00.113 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:29:00.113 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:00.113 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:00.113 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:00.113 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:00.113 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:29:00.372 malloc1 00:29:00.372 13:54:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:00.633 [2024-07-12 13:54:49.009501] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:00.633 [2024-07-12 13:54:49.009549] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:00.633 [2024-07-12 13:54:49.009571] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd3ce20 00:29:00.633 [2024-07-12 13:54:49.009583] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:00.633 [2024-07-12 13:54:49.011109] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:00.633 [2024-07-12 13:54:49.011138] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:00.633 pt1 00:29:00.633 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:00.633 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:00.633 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:29:00.633 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:29:00.633 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:00.633 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:00.633 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:00.633 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:00.633 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:29:00.891 malloc2 00:29:00.891 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:01.150 [2024-07-12 13:54:49.503828] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:01.150 [2024-07-12 13:54:49.503874] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:01.150 [2024-07-12 13:54:49.503894] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd21eb0 00:29:01.150 [2024-07-12 13:54:49.503907] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:01.150 [2024-07-12 13:54:49.505410] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:01.150 [2024-07-12 13:54:49.505440] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:01.150 pt2 00:29:01.150 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:01.150 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:01.150 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:29:01.409 [2024-07-12 13:54:49.744492] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:01.409 [2024-07-12 13:54:49.746011] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:01.409 [2024-07-12 13:54:49.746171] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd237f0 00:29:01.410 [2024-07-12 13:54:49.746185] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:01.410 [2024-07-12 13:54:49.746259] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd240a0 00:29:01.410 [2024-07-12 13:54:49.746343] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd237f0 00:29:01.410 [2024-07-12 13:54:49.746353] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd237f0 00:29:01.410 [2024-07-12 13:54:49.746414] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.410 13:54:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:01.668 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:01.668 "name": "raid_bdev1", 00:29:01.668 "uuid": "ee1b6098-5cf6-476a-915f-a446dc3b5e21", 00:29:01.668 "strip_size_kb": 0, 00:29:01.668 "state": "online", 00:29:01.668 "raid_level": "raid1", 00:29:01.668 "superblock": true, 00:29:01.668 "num_base_bdevs": 2, 00:29:01.668 "num_base_bdevs_discovered": 2, 00:29:01.668 "num_base_bdevs_operational": 2, 00:29:01.668 "base_bdevs_list": [ 00:29:01.668 { 00:29:01.668 "name": "pt1", 00:29:01.668 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:01.668 "is_configured": true, 00:29:01.668 "data_offset": 256, 00:29:01.668 "data_size": 7936 00:29:01.668 }, 00:29:01.668 { 00:29:01.668 "name": "pt2", 00:29:01.668 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:01.668 "is_configured": true, 00:29:01.668 "data_offset": 256, 00:29:01.668 "data_size": 7936 00:29:01.668 } 00:29:01.668 ] 00:29:01.668 }' 00:29:01.668 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:01.668 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:02.236 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:29:02.236 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:02.236 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:02.236 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:02.236 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:02.236 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:02.236 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:02.236 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:02.495 [2024-07-12 13:54:50.863701] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:02.495 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:02.495 "name": "raid_bdev1", 00:29:02.495 "aliases": [ 00:29:02.495 "ee1b6098-5cf6-476a-915f-a446dc3b5e21" 00:29:02.495 ], 00:29:02.495 "product_name": "Raid Volume", 00:29:02.495 "block_size": 4128, 00:29:02.495 "num_blocks": 7936, 00:29:02.495 "uuid": "ee1b6098-5cf6-476a-915f-a446dc3b5e21", 00:29:02.495 "md_size": 32, 00:29:02.495 "md_interleave": true, 00:29:02.495 "dif_type": 0, 00:29:02.495 "assigned_rate_limits": { 00:29:02.495 "rw_ios_per_sec": 0, 00:29:02.495 "rw_mbytes_per_sec": 0, 00:29:02.495 "r_mbytes_per_sec": 0, 00:29:02.495 "w_mbytes_per_sec": 0 00:29:02.495 }, 00:29:02.495 "claimed": false, 00:29:02.495 "zoned": false, 00:29:02.495 "supported_io_types": { 00:29:02.495 "read": true, 00:29:02.495 "write": true, 00:29:02.495 "unmap": false, 00:29:02.495 "flush": false, 00:29:02.495 "reset": true, 00:29:02.495 "nvme_admin": false, 00:29:02.495 "nvme_io": false, 00:29:02.495 "nvme_io_md": false, 00:29:02.495 "write_zeroes": true, 00:29:02.495 "zcopy": false, 00:29:02.495 "get_zone_info": false, 00:29:02.495 "zone_management": false, 00:29:02.495 "zone_append": false, 00:29:02.495 "compare": false, 00:29:02.495 "compare_and_write": false, 00:29:02.495 "abort": false, 00:29:02.495 "seek_hole": false, 00:29:02.495 "seek_data": false, 00:29:02.495 "copy": false, 00:29:02.495 "nvme_iov_md": false 00:29:02.495 }, 00:29:02.495 "memory_domains": [ 00:29:02.495 { 00:29:02.495 "dma_device_id": "system", 00:29:02.495 "dma_device_type": 1 00:29:02.495 }, 00:29:02.495 { 00:29:02.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:02.495 "dma_device_type": 2 00:29:02.495 }, 00:29:02.495 { 00:29:02.495 "dma_device_id": "system", 00:29:02.495 "dma_device_type": 1 00:29:02.495 }, 00:29:02.495 { 00:29:02.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:02.495 "dma_device_type": 2 00:29:02.495 } 00:29:02.495 ], 00:29:02.495 "driver_specific": { 00:29:02.495 "raid": { 00:29:02.495 "uuid": "ee1b6098-5cf6-476a-915f-a446dc3b5e21", 00:29:02.495 "strip_size_kb": 0, 00:29:02.495 "state": "online", 00:29:02.495 "raid_level": "raid1", 00:29:02.495 "superblock": true, 00:29:02.495 "num_base_bdevs": 2, 00:29:02.495 "num_base_bdevs_discovered": 2, 00:29:02.495 "num_base_bdevs_operational": 2, 00:29:02.495 "base_bdevs_list": [ 00:29:02.495 { 00:29:02.495 "name": "pt1", 00:29:02.495 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:02.495 "is_configured": true, 00:29:02.495 "data_offset": 256, 00:29:02.495 "data_size": 7936 00:29:02.495 }, 00:29:02.495 { 00:29:02.495 "name": "pt2", 00:29:02.495 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:02.495 "is_configured": true, 00:29:02.495 "data_offset": 256, 00:29:02.495 "data_size": 7936 00:29:02.495 } 00:29:02.495 ] 00:29:02.495 } 00:29:02.495 } 00:29:02.495 }' 00:29:02.495 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:02.495 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:02.495 pt2' 00:29:02.495 
13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:02.495 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:02.495 13:54:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:02.753 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:02.753 "name": "pt1", 00:29:02.753 "aliases": [ 00:29:02.753 "00000000-0000-0000-0000-000000000001" 00:29:02.753 ], 00:29:02.753 "product_name": "passthru", 00:29:02.753 "block_size": 4128, 00:29:02.753 "num_blocks": 8192, 00:29:02.753 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:02.753 "md_size": 32, 00:29:02.753 "md_interleave": true, 00:29:02.753 "dif_type": 0, 00:29:02.753 "assigned_rate_limits": { 00:29:02.753 "rw_ios_per_sec": 0, 00:29:02.753 "rw_mbytes_per_sec": 0, 00:29:02.753 "r_mbytes_per_sec": 0, 00:29:02.753 "w_mbytes_per_sec": 0 00:29:02.753 }, 00:29:02.753 "claimed": true, 00:29:02.753 "claim_type": "exclusive_write", 00:29:02.753 "zoned": false, 00:29:02.753 "supported_io_types": { 00:29:02.753 "read": true, 00:29:02.753 "write": true, 00:29:02.753 "unmap": true, 00:29:02.753 "flush": true, 00:29:02.753 "reset": true, 00:29:02.753 "nvme_admin": false, 00:29:02.753 "nvme_io": false, 00:29:02.753 "nvme_io_md": false, 00:29:02.753 "write_zeroes": true, 00:29:02.753 "zcopy": true, 00:29:02.753 "get_zone_info": false, 00:29:02.753 "zone_management": false, 00:29:02.753 "zone_append": false, 00:29:02.753 "compare": false, 00:29:02.753 "compare_and_write": false, 00:29:02.753 "abort": true, 00:29:02.753 "seek_hole": false, 00:29:02.753 "seek_data": false, 00:29:02.753 "copy": true, 00:29:02.753 "nvme_iov_md": false 00:29:02.753 }, 00:29:02.753 "memory_domains": [ 00:29:02.753 { 00:29:02.753 "dma_device_id": "system", 00:29:02.753 "dma_device_type": 1 00:29:02.753 }, 00:29:02.753 { 00:29:02.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:02.753 "dma_device_type": 2 00:29:02.753 } 00:29:02.753 ], 00:29:02.753 "driver_specific": { 00:29:02.753 "passthru": { 00:29:02.753 "name": "pt1", 00:29:02.753 "base_bdev_name": "malloc1" 00:29:02.753 } 00:29:02.753 } 00:29:02.753 }' 00:29:02.753 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:02.753 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:02.753 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:02.753 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:02.753 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:03.079 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:03.079 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:03.079 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:03.079 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:03.079 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:03.079 13:54:51 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:03.079 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:03.079 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:03.079 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:03.079 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:03.350 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:03.350 "name": "pt2", 00:29:03.350 "aliases": [ 00:29:03.350 "00000000-0000-0000-0000-000000000002" 00:29:03.350 ], 00:29:03.350 "product_name": "passthru", 00:29:03.350 "block_size": 4128, 00:29:03.350 "num_blocks": 8192, 00:29:03.350 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:03.350 "md_size": 32, 00:29:03.350 "md_interleave": true, 00:29:03.350 "dif_type": 0, 00:29:03.350 "assigned_rate_limits": { 00:29:03.350 "rw_ios_per_sec": 0, 00:29:03.350 "rw_mbytes_per_sec": 0, 00:29:03.350 "r_mbytes_per_sec": 0, 00:29:03.350 "w_mbytes_per_sec": 0 00:29:03.350 }, 00:29:03.350 "claimed": true, 00:29:03.350 "claim_type": "exclusive_write", 00:29:03.350 "zoned": false, 00:29:03.350 "supported_io_types": { 00:29:03.350 "read": true, 00:29:03.350 "write": true, 00:29:03.350 "unmap": true, 00:29:03.350 "flush": true, 00:29:03.350 "reset": true, 00:29:03.350 "nvme_admin": false, 00:29:03.350 "nvme_io": false, 00:29:03.350 "nvme_io_md": false, 00:29:03.350 "write_zeroes": true, 00:29:03.350 "zcopy": true, 00:29:03.350 "get_zone_info": false, 00:29:03.350 "zone_management": false, 00:29:03.351 "zone_append": false, 00:29:03.351 "compare": false, 00:29:03.351 "compare_and_write": false, 00:29:03.351 "abort": true, 00:29:03.351 "seek_hole": false, 00:29:03.351 "seek_data": false, 00:29:03.351 "copy": true, 00:29:03.351 "nvme_iov_md": false 00:29:03.351 }, 00:29:03.351 "memory_domains": [ 00:29:03.351 { 00:29:03.351 "dma_device_id": "system", 00:29:03.351 "dma_device_type": 1 00:29:03.351 }, 00:29:03.351 { 00:29:03.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:03.351 "dma_device_type": 2 00:29:03.351 } 00:29:03.351 ], 00:29:03.351 "driver_specific": { 00:29:03.351 "passthru": { 00:29:03.351 "name": "pt2", 00:29:03.351 "base_bdev_name": "malloc2" 00:29:03.351 } 00:29:03.351 } 00:29:03.351 }' 00:29:03.351 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:03.351 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:03.351 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:03.351 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:03.351 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:03.609 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:03.609 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:03.609 13:54:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:03.609 13:54:52 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:03.609 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:03.609 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:03.609 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:03.609 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:03.609 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:29:03.984 [2024-07-12 13:54:52.359659] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:03.984 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ee1b6098-5cf6-476a-915f-a446dc3b5e21 00:29:03.984 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z ee1b6098-5cf6-476a-915f-a446dc3b5e21 ']' 00:29:03.984 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:04.243 [2024-07-12 13:54:52.608076] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:04.243 [2024-07-12 13:54:52.608099] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:04.243 [2024-07-12 13:54:52.608154] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:04.243 [2024-07-12 13:54:52.608206] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:04.243 [2024-07-12 13:54:52.608218] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd237f0 name raid_bdev1, state offline 00:29:04.243 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.243 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:29:04.501 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:29:04.501 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:29:04.501 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:04.501 13:54:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:04.758 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:04.758 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:29:05.016 13:54:53 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:05.016 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:05.017 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:05.274 [2024-07-12 13:54:53.823243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:29:05.274 [2024-07-12 13:54:53.824632] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:29:05.274 [2024-07-12 13:54:53.824686] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:29:05.274 [2024-07-12 13:54:53.824727] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:29:05.274 [2024-07-12 13:54:53.824745] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:05.274 [2024-07-12 13:54:53.824755] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd2d630 name raid_bdev1, state configuring 00:29:05.274 request: 00:29:05.274 { 00:29:05.274 "name": "raid_bdev1", 00:29:05.274 "raid_level": "raid1", 00:29:05.274 "base_bdevs": [ 00:29:05.274 "malloc1", 00:29:05.274 "malloc2" 00:29:05.274 ], 00:29:05.274 "superblock": false, 00:29:05.274 "method": 
"bdev_raid_create", 00:29:05.274 "req_id": 1 00:29:05.274 } 00:29:05.274 Got JSON-RPC error response 00:29:05.274 response: 00:29:05.274 { 00:29:05.274 "code": -17, 00:29:05.274 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:29:05.274 } 00:29:05.274 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:05.274 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:05.274 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:05.274 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:05.274 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:05.274 13:54:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:29:05.532 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:29:05.532 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:29:05.532 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:05.790 [2024-07-12 13:54:54.316494] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:05.790 [2024-07-12 13:54:54.316535] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:05.790 [2024-07-12 13:54:54.316552] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd2d390 00:29:05.790 [2024-07-12 13:54:54.316564] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:05.790 [2024-07-12 13:54:54.317963] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:05.790 [2024-07-12 13:54:54.317990] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:05.790 [2024-07-12 13:54:54.318033] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:05.790 [2024-07-12 13:54:54.318057] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:05.790 pt1 00:29:05.790 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:29:05.790 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:05.790 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:05.790 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:05.790 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:05.790 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:05.790 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:05.790 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:05.790 
13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:05.790 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:05.790 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:05.790 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:06.049 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:06.049 "name": "raid_bdev1", 00:29:06.049 "uuid": "ee1b6098-5cf6-476a-915f-a446dc3b5e21", 00:29:06.049 "strip_size_kb": 0, 00:29:06.049 "state": "configuring", 00:29:06.049 "raid_level": "raid1", 00:29:06.049 "superblock": true, 00:29:06.049 "num_base_bdevs": 2, 00:29:06.049 "num_base_bdevs_discovered": 1, 00:29:06.049 "num_base_bdevs_operational": 2, 00:29:06.049 "base_bdevs_list": [ 00:29:06.050 { 00:29:06.050 "name": "pt1", 00:29:06.050 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:06.050 "is_configured": true, 00:29:06.050 "data_offset": 256, 00:29:06.050 "data_size": 7936 00:29:06.050 }, 00:29:06.050 { 00:29:06.050 "name": null, 00:29:06.050 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:06.050 "is_configured": false, 00:29:06.050 "data_offset": 256, 00:29:06.050 "data_size": 7936 00:29:06.050 } 00:29:06.050 ] 00:29:06.050 }' 00:29:06.050 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:06.050 13:54:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:06.988 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:29:06.988 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:29:06.988 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:06.988 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:07.247 [2024-07-12 13:54:55.724241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:07.247 [2024-07-12 13:54:55.724292] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:07.247 [2024-07-12 13:54:55.724310] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd225e0 00:29:07.247 [2024-07-12 13:54:55.724322] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:07.247 [2024-07-12 13:54:55.724484] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:07.247 [2024-07-12 13:54:55.724502] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:07.247 [2024-07-12 13:54:55.724546] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:07.247 [2024-07-12 13:54:55.724565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:07.247 [2024-07-12 13:54:55.724648] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd25060 00:29:07.247 [2024-07-12 13:54:55.724659] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:07.247 [2024-07-12 13:54:55.724713] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd26200 00:29:07.247 [2024-07-12 13:54:55.724787] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd25060 00:29:07.247 [2024-07-12 13:54:55.724798] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd25060 00:29:07.247 [2024-07-12 13:54:55.724856] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:07.247 pt2 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.247 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:07.506 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:07.506 "name": "raid_bdev1", 00:29:07.506 "uuid": "ee1b6098-5cf6-476a-915f-a446dc3b5e21", 00:29:07.506 "strip_size_kb": 0, 00:29:07.506 "state": "online", 00:29:07.506 "raid_level": "raid1", 00:29:07.506 "superblock": true, 00:29:07.506 "num_base_bdevs": 2, 00:29:07.506 "num_base_bdevs_discovered": 2, 00:29:07.506 "num_base_bdevs_operational": 2, 00:29:07.506 "base_bdevs_list": [ 00:29:07.506 { 00:29:07.506 "name": "pt1", 00:29:07.506 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:07.506 "is_configured": true, 00:29:07.506 "data_offset": 256, 00:29:07.506 "data_size": 7936 00:29:07.506 }, 00:29:07.506 { 00:29:07.506 "name": "pt2", 00:29:07.506 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:07.506 "is_configured": true, 00:29:07.506 "data_offset": 256, 00:29:07.506 "data_size": 7936 00:29:07.506 } 00:29:07.506 ] 00:29:07.506 }' 00:29:07.506 13:54:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:07.506 13:54:55 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:08.444 13:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:29:08.444 13:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:08.444 13:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:08.444 13:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:08.444 13:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:08.444 13:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:08.444 13:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:08.444 13:54:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:08.701 [2024-07-12 13:54:57.224484] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:08.701 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:08.701 "name": "raid_bdev1", 00:29:08.701 "aliases": [ 00:29:08.702 "ee1b6098-5cf6-476a-915f-a446dc3b5e21" 00:29:08.702 ], 00:29:08.702 "product_name": "Raid Volume", 00:29:08.702 "block_size": 4128, 00:29:08.702 "num_blocks": 7936, 00:29:08.702 "uuid": "ee1b6098-5cf6-476a-915f-a446dc3b5e21", 00:29:08.702 "md_size": 32, 00:29:08.702 "md_interleave": true, 00:29:08.702 "dif_type": 0, 00:29:08.702 "assigned_rate_limits": { 00:29:08.702 "rw_ios_per_sec": 0, 00:29:08.702 "rw_mbytes_per_sec": 0, 00:29:08.702 "r_mbytes_per_sec": 0, 00:29:08.702 "w_mbytes_per_sec": 0 00:29:08.702 }, 00:29:08.702 "claimed": false, 00:29:08.702 "zoned": false, 00:29:08.702 "supported_io_types": { 00:29:08.702 "read": true, 00:29:08.702 "write": true, 00:29:08.702 "unmap": false, 00:29:08.702 "flush": false, 00:29:08.702 "reset": true, 00:29:08.702 "nvme_admin": false, 00:29:08.702 "nvme_io": false, 00:29:08.702 "nvme_io_md": false, 00:29:08.702 "write_zeroes": true, 00:29:08.702 "zcopy": false, 00:29:08.702 "get_zone_info": false, 00:29:08.702 "zone_management": false, 00:29:08.702 "zone_append": false, 00:29:08.702 "compare": false, 00:29:08.702 "compare_and_write": false, 00:29:08.702 "abort": false, 00:29:08.702 "seek_hole": false, 00:29:08.702 "seek_data": false, 00:29:08.702 "copy": false, 00:29:08.702 "nvme_iov_md": false 00:29:08.702 }, 00:29:08.702 "memory_domains": [ 00:29:08.702 { 00:29:08.702 "dma_device_id": "system", 00:29:08.702 "dma_device_type": 1 00:29:08.702 }, 00:29:08.702 { 00:29:08.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:08.702 "dma_device_type": 2 00:29:08.702 }, 00:29:08.702 { 00:29:08.702 "dma_device_id": "system", 00:29:08.702 "dma_device_type": 1 00:29:08.702 }, 00:29:08.702 { 00:29:08.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:08.702 "dma_device_type": 2 00:29:08.702 } 00:29:08.702 ], 00:29:08.702 "driver_specific": { 00:29:08.702 "raid": { 00:29:08.702 "uuid": "ee1b6098-5cf6-476a-915f-a446dc3b5e21", 00:29:08.702 "strip_size_kb": 0, 00:29:08.702 "state": "online", 00:29:08.702 "raid_level": "raid1", 00:29:08.702 "superblock": true, 00:29:08.702 "num_base_bdevs": 2, 00:29:08.702 
"num_base_bdevs_discovered": 2, 00:29:08.702 "num_base_bdevs_operational": 2, 00:29:08.702 "base_bdevs_list": [ 00:29:08.702 { 00:29:08.702 "name": "pt1", 00:29:08.702 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:08.702 "is_configured": true, 00:29:08.702 "data_offset": 256, 00:29:08.702 "data_size": 7936 00:29:08.702 }, 00:29:08.702 { 00:29:08.702 "name": "pt2", 00:29:08.702 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:08.702 "is_configured": true, 00:29:08.702 "data_offset": 256, 00:29:08.702 "data_size": 7936 00:29:08.702 } 00:29:08.702 ] 00:29:08.702 } 00:29:08.702 } 00:29:08.702 }' 00:29:08.702 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:08.961 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:08.961 pt2' 00:29:08.961 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:08.961 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:08.961 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:09.220 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:09.220 "name": "pt1", 00:29:09.220 "aliases": [ 00:29:09.220 "00000000-0000-0000-0000-000000000001" 00:29:09.220 ], 00:29:09.220 "product_name": "passthru", 00:29:09.220 "block_size": 4128, 00:29:09.220 "num_blocks": 8192, 00:29:09.220 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:09.220 "md_size": 32, 00:29:09.220 "md_interleave": true, 00:29:09.220 "dif_type": 0, 00:29:09.220 "assigned_rate_limits": { 00:29:09.220 "rw_ios_per_sec": 0, 00:29:09.220 "rw_mbytes_per_sec": 0, 00:29:09.220 "r_mbytes_per_sec": 0, 00:29:09.220 "w_mbytes_per_sec": 0 00:29:09.220 }, 00:29:09.220 "claimed": true, 00:29:09.220 "claim_type": "exclusive_write", 00:29:09.220 "zoned": false, 00:29:09.220 "supported_io_types": { 00:29:09.220 "read": true, 00:29:09.220 "write": true, 00:29:09.220 "unmap": true, 00:29:09.220 "flush": true, 00:29:09.220 "reset": true, 00:29:09.220 "nvme_admin": false, 00:29:09.220 "nvme_io": false, 00:29:09.220 "nvme_io_md": false, 00:29:09.220 "write_zeroes": true, 00:29:09.220 "zcopy": true, 00:29:09.220 "get_zone_info": false, 00:29:09.220 "zone_management": false, 00:29:09.220 "zone_append": false, 00:29:09.220 "compare": false, 00:29:09.220 "compare_and_write": false, 00:29:09.220 "abort": true, 00:29:09.220 "seek_hole": false, 00:29:09.220 "seek_data": false, 00:29:09.220 "copy": true, 00:29:09.220 "nvme_iov_md": false 00:29:09.220 }, 00:29:09.220 "memory_domains": [ 00:29:09.220 { 00:29:09.220 "dma_device_id": "system", 00:29:09.220 "dma_device_type": 1 00:29:09.220 }, 00:29:09.220 { 00:29:09.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:09.220 "dma_device_type": 2 00:29:09.220 } 00:29:09.220 ], 00:29:09.220 "driver_specific": { 00:29:09.220 "passthru": { 00:29:09.220 "name": "pt1", 00:29:09.220 "base_bdev_name": "malloc1" 00:29:09.220 } 00:29:09.220 } 00:29:09.220 }' 00:29:09.220 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:09.220 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:29:09.220 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:09.220 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:09.220 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:09.220 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:09.220 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:09.220 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:09.479 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:09.479 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:09.479 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:09.479 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:09.479 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:09.479 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:09.479 13:54:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:09.744 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:09.744 "name": "pt2", 00:29:09.744 "aliases": [ 00:29:09.744 "00000000-0000-0000-0000-000000000002" 00:29:09.744 ], 00:29:09.744 "product_name": "passthru", 00:29:09.744 "block_size": 4128, 00:29:09.744 "num_blocks": 8192, 00:29:09.744 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:09.744 "md_size": 32, 00:29:09.744 "md_interleave": true, 00:29:09.744 "dif_type": 0, 00:29:09.744 "assigned_rate_limits": { 00:29:09.744 "rw_ios_per_sec": 0, 00:29:09.744 "rw_mbytes_per_sec": 0, 00:29:09.744 "r_mbytes_per_sec": 0, 00:29:09.744 "w_mbytes_per_sec": 0 00:29:09.744 }, 00:29:09.744 "claimed": true, 00:29:09.744 "claim_type": "exclusive_write", 00:29:09.744 "zoned": false, 00:29:09.744 "supported_io_types": { 00:29:09.744 "read": true, 00:29:09.744 "write": true, 00:29:09.744 "unmap": true, 00:29:09.744 "flush": true, 00:29:09.744 "reset": true, 00:29:09.744 "nvme_admin": false, 00:29:09.744 "nvme_io": false, 00:29:09.744 "nvme_io_md": false, 00:29:09.744 "write_zeroes": true, 00:29:09.744 "zcopy": true, 00:29:09.744 "get_zone_info": false, 00:29:09.744 "zone_management": false, 00:29:09.744 "zone_append": false, 00:29:09.744 "compare": false, 00:29:09.744 "compare_and_write": false, 00:29:09.744 "abort": true, 00:29:09.744 "seek_hole": false, 00:29:09.744 "seek_data": false, 00:29:09.744 "copy": true, 00:29:09.744 "nvme_iov_md": false 00:29:09.744 }, 00:29:09.744 "memory_domains": [ 00:29:09.744 { 00:29:09.744 "dma_device_id": "system", 00:29:09.744 "dma_device_type": 1 00:29:09.744 }, 00:29:09.744 { 00:29:09.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:09.744 "dma_device_type": 2 00:29:09.744 } 00:29:09.744 ], 00:29:09.744 "driver_specific": { 00:29:09.744 "passthru": { 00:29:09.744 "name": "pt2", 00:29:09.744 "base_bdev_name": "malloc2" 00:29:09.744 } 00:29:09.744 } 00:29:09.744 }' 00:29:09.744 13:54:58 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:09.744 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:09.744 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:09.744 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:09.744 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:10.003 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:10.003 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:10.003 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:10.003 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:10.003 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:10.003 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:10.003 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:10.003 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:10.003 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:29:10.262 [2024-07-12 13:54:58.744501] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:10.262 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' ee1b6098-5cf6-476a-915f-a446dc3b5e21 '!=' ee1b6098-5cf6-476a-915f-a446dc3b5e21 ']' 00:29:10.262 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:29:10.262 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:10.262 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:10.262 13:54:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:10.521 [2024-07-12 13:54:58.988911] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:29:10.521 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:10.521 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:10.521 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:10.521 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:10.521 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:10.521 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:10.521 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:10.521 13:54:59 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:10.521 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:10.521 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:10.521 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.521 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:10.781 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:10.781 "name": "raid_bdev1", 00:29:10.781 "uuid": "ee1b6098-5cf6-476a-915f-a446dc3b5e21", 00:29:10.781 "strip_size_kb": 0, 00:29:10.781 "state": "online", 00:29:10.781 "raid_level": "raid1", 00:29:10.781 "superblock": true, 00:29:10.781 "num_base_bdevs": 2, 00:29:10.781 "num_base_bdevs_discovered": 1, 00:29:10.781 "num_base_bdevs_operational": 1, 00:29:10.781 "base_bdevs_list": [ 00:29:10.781 { 00:29:10.781 "name": null, 00:29:10.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:10.781 "is_configured": false, 00:29:10.781 "data_offset": 256, 00:29:10.781 "data_size": 7936 00:29:10.781 }, 00:29:10.781 { 00:29:10.781 "name": "pt2", 00:29:10.781 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:10.781 "is_configured": true, 00:29:10.781 "data_offset": 256, 00:29:10.781 "data_size": 7936 00:29:10.781 } 00:29:10.781 ] 00:29:10.781 }' 00:29:10.781 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:10.781 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:11.345 13:54:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:11.603 [2024-07-12 13:55:00.079796] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:11.603 [2024-07-12 13:55:00.079829] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:11.603 [2024-07-12 13:55:00.079882] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:11.603 [2024-07-12 13:55:00.079924] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:11.603 [2024-07-12 13:55:00.079968] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd25060 name raid_bdev1, state offline 00:29:11.603 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.603 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:29:11.860 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:29:11.860 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:29:11.860 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:29:11.860 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 
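The verify_raid_bdev_state checks traced above reduce to a single RPC call plus a jq filter over its JSON output. As a hedged, standalone sketch (the rpc.py path and socket are taken from the log itself; the expected values and the error message are illustrative, and the real helper additionally compares num_base_bdevs_operational and the base_bdevs_list entries), the same kind of check could be scripted as:

    #!/usr/bin/env bash
    # Query all raid bdevs from the running SPDK app and pick out raid_bdev1.
    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/spdk-raid.sock
    info=$("$RPC" -s "$SOCK" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    # Pull the fields the test asserts on and compare them to the expected values.
    state=$(echo "$info" | jq -r .state)
    level=$(echo "$info" | jq -r .raid_level)
    discovered=$(echo "$info" | jq -r .num_base_bdevs_discovered)
    if [ "$state" != "online" ] || [ "$level" != "raid1" ] || [ "$discovered" -lt 1 ]; then
        echo "raid_bdev1 not in expected state: state=$state level=$level discovered=$discovered"
    fi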
00:29:11.860 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:12.118 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:29:12.118 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:12.118 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:29:12.118 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:29:12.118 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:29:12.118 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:12.380 [2024-07-12 13:55:00.773605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:12.380 [2024-07-12 13:55:00.773653] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:12.380 [2024-07-12 13:55:00.773670] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd22810 00:29:12.380 [2024-07-12 13:55:00.773683] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:12.380 [2024-07-12 13:55:00.775108] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:12.380 [2024-07-12 13:55:00.775137] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:12.380 [2024-07-12 13:55:00.775184] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:12.380 [2024-07-12 13:55:00.775211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:12.380 [2024-07-12 13:55:00.775277] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd26250 00:29:12.380 [2024-07-12 13:55:00.775287] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:12.380 [2024-07-12 13:55:00.775342] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd3d370 00:29:12.380 [2024-07-12 13:55:00.775415] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd26250 00:29:12.380 [2024-07-12 13:55:00.775425] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd26250 00:29:12.380 [2024-07-12 13:55:00.775481] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:12.380 pt2 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.380 13:55:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:12.646 13:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:12.646 "name": "raid_bdev1", 00:29:12.646 "uuid": "ee1b6098-5cf6-476a-915f-a446dc3b5e21", 00:29:12.646 "strip_size_kb": 0, 00:29:12.646 "state": "online", 00:29:12.646 "raid_level": "raid1", 00:29:12.646 "superblock": true, 00:29:12.646 "num_base_bdevs": 2, 00:29:12.646 "num_base_bdevs_discovered": 1, 00:29:12.646 "num_base_bdevs_operational": 1, 00:29:12.646 "base_bdevs_list": [ 00:29:12.646 { 00:29:12.646 "name": null, 00:29:12.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:12.646 "is_configured": false, 00:29:12.646 "data_offset": 256, 00:29:12.646 "data_size": 7936 00:29:12.646 }, 00:29:12.646 { 00:29:12.646 "name": "pt2", 00:29:12.647 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:12.647 "is_configured": true, 00:29:12.647 "data_offset": 256, 00:29:12.647 "data_size": 7936 00:29:12.647 } 00:29:12.647 ] 00:29:12.647 }' 00:29:12.647 13:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:12.647 13:55:01 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:13.213 13:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:13.471 [2024-07-12 13:55:01.832416] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:13.471 [2024-07-12 13:55:01.832442] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:13.471 [2024-07-12 13:55:01.832495] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:13.471 [2024-07-12 13:55:01.832538] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:13.471 [2024-07-12 13:55:01.832550] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd26250 name raid_bdev1, state offline 00:29:13.471 13:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.471 13:55:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:29:13.729 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:29:13.729 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:29:13.729 13:55:02 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:29:13.729 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:13.986 [2024-07-12 13:55:02.321689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:13.986 [2024-07-12 13:55:02.321735] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:13.986 [2024-07-12 13:55:02.321753] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd24cc0 00:29:13.986 [2024-07-12 13:55:02.321766] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:13.986 [2024-07-12 13:55:02.323242] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:13.986 [2024-07-12 13:55:02.323272] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:13.986 [2024-07-12 13:55:02.323319] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:13.986 [2024-07-12 13:55:02.323344] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:13.986 [2024-07-12 13:55:02.323424] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:29:13.986 [2024-07-12 13:55:02.323437] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:13.987 [2024-07-12 13:55:02.323453] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd25bb0 name raid_bdev1, state configuring 00:29:13.987 [2024-07-12 13:55:02.323476] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:13.987 [2024-07-12 13:55:02.323529] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd26f60 00:29:13.987 [2024-07-12 13:55:02.323539] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:13.987 [2024-07-12 13:55:02.323595] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd25600 00:29:13.987 [2024-07-12 13:55:02.323665] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd26f60 00:29:13.987 [2024-07-12 13:55:02.323675] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd26f60 00:29:13.987 [2024-07-12 13:55:02.323734] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:13.987 pt1 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:13.987 13:55:02 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.987 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:14.244 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:14.244 "name": "raid_bdev1", 00:29:14.244 "uuid": "ee1b6098-5cf6-476a-915f-a446dc3b5e21", 00:29:14.244 "strip_size_kb": 0, 00:29:14.244 "state": "online", 00:29:14.244 "raid_level": "raid1", 00:29:14.244 "superblock": true, 00:29:14.244 "num_base_bdevs": 2, 00:29:14.244 "num_base_bdevs_discovered": 1, 00:29:14.244 "num_base_bdevs_operational": 1, 00:29:14.244 "base_bdevs_list": [ 00:29:14.244 { 00:29:14.244 "name": null, 00:29:14.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:14.244 "is_configured": false, 00:29:14.244 "data_offset": 256, 00:29:14.244 "data_size": 7936 00:29:14.244 }, 00:29:14.244 { 00:29:14.244 "name": "pt2", 00:29:14.244 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:14.244 "is_configured": true, 00:29:14.244 "data_offset": 256, 00:29:14.244 "data_size": 7936 00:29:14.244 } 00:29:14.244 ] 00:29:14.244 }' 00:29:14.244 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:14.244 13:55:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:14.813 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:29:14.813 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:29:15.071 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:29:15.071 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:15.071 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:29:15.330 [2024-07-12 13:55:03.665491] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:15.330 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' ee1b6098-5cf6-476a-915f-a446dc3b5e21 '!=' ee1b6098-5cf6-476a-915f-a446dc3b5e21 ']' 00:29:15.330 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 590646 00:29:15.330 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 590646 ']' 00:29:15.330 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 590646 00:29:15.330 13:55:03 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:15.330 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:15.330 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 590646 00:29:15.330 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:15.330 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:15.330 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 590646' 00:29:15.330 killing process with pid 590646 00:29:15.330 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 590646 00:29:15.330 [2024-07-12 13:55:03.741835] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:15.330 [2024-07-12 13:55:03.741888] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:15.330 [2024-07-12 13:55:03.741937] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:15.330 [2024-07-12 13:55:03.741949] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd26f60 name raid_bdev1, state offline 00:29:15.330 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 590646 00:29:15.330 [2024-07-12 13:55:03.760175] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:15.589 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:29:15.589 00:29:15.589 real 0m16.454s 00:29:15.589 user 0m29.850s 00:29:15.589 sys 0m3.060s 00:29:15.589 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:15.589 13:55:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:15.589 ************************************ 00:29:15.589 END TEST raid_superblock_test_md_interleaved 00:29:15.589 ************************************ 00:29:15.589 13:55:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:15.589 13:55:04 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:29:15.589 13:55:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:29:15.589 13:55:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:15.589 13:55:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:15.589 ************************************ 00:29:15.589 START TEST raid_rebuild_test_sb_md_interleaved 00:29:15.589 ************************************ 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:29:15.589 
13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:29:15.589 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=593074 00:29:15.590 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 593074 /var/tmp/spdk-raid.sock 00:29:15.590 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:29:15.590 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 593074 ']' 00:29:15.590 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:15.590 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:15.590 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:29:15.590 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:15.590 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:15.590 13:55:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:15.590 [2024-07-12 13:55:04.140059] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:29:15.590 [2024-07-12 13:55:04.140126] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid593074 ] 00:29:15.590 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:15.590 Zero copy mechanism will not be used. 00:29:15.848 [2024-07-12 13:55:04.252056] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:15.848 [2024-07-12 13:55:04.359858] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:15.848 [2024-07-12 13:55:04.428490] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:15.848 [2024-07-12 13:55:04.428528] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:16.432 13:55:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:16.432 13:55:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:16.432 13:55:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:16.432 13:55:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:29:16.692 BaseBdev1_malloc 00:29:16.692 13:55:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:16.951 [2024-07-12 13:55:05.482687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:16.951 [2024-07-12 13:55:05.482737] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:16.951 [2024-07-12 13:55:05.482762] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2605620 00:29:16.951 [2024-07-12 13:55:05.482775] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:16.951 [2024-07-12 13:55:05.484253] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:16.951 [2024-07-12 13:55:05.484285] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:16.951 BaseBdev1 00:29:16.951 13:55:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:16.951 13:55:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:29:17.210 BaseBdev2_malloc 00:29:17.210 13:55:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:17.468 [2024-07-12 13:55:05.985204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:17.468 [2024-07-12 13:55:05.985252] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:17.468 [2024-07-12 13:55:05.985274] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25fcc10 00:29:17.468 [2024-07-12 13:55:05.985287] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:17.468 [2024-07-12 13:55:05.986941] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:17.468 [2024-07-12 13:55:05.986971] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:17.468 BaseBdev2 00:29:17.468 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:29:17.725 spare_malloc 00:29:17.725 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:29:17.982 spare_delay 00:29:17.982 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:18.240 [2024-07-12 13:55:06.659833] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:18.240 [2024-07-12 13:55:06.659883] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:18.240 [2024-07-12 13:55:06.659907] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25ff9b0 00:29:18.240 [2024-07-12 13:55:06.659920] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:18.240 [2024-07-12 13:55:06.661290] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:18.240 [2024-07-12 13:55:06.661319] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:18.240 spare 00:29:18.240 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:18.499 [2024-07-12 13:55:06.900495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:18.499 [2024-07-12 13:55:06.901741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:18.499 [2024-07-12 13:55:06.901903] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2601cb0 00:29:18.499 [2024-07-12 13:55:06.901916] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:18.499 [2024-07-12 13:55:06.902002] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2468300 00:29:18.499 [2024-07-12 13:55:06.902087] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2601cb0 00:29:18.499 [2024-07-12 13:55:06.902097] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2601cb0 00:29:18.499 [2024-07-12 13:55:06.902154] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:18.499 13:55:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:18.757 13:55:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:18.757 "name": "raid_bdev1", 00:29:18.757 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:18.757 "strip_size_kb": 0, 00:29:18.757 "state": "online", 00:29:18.757 "raid_level": "raid1", 00:29:18.757 "superblock": true, 00:29:18.757 "num_base_bdevs": 2, 00:29:18.757 "num_base_bdevs_discovered": 2, 00:29:18.757 "num_base_bdevs_operational": 2, 00:29:18.757 "base_bdevs_list": [ 00:29:18.757 { 00:29:18.757 "name": "BaseBdev1", 00:29:18.757 "uuid": "43d2a313-b395-508b-a593-fcefda690907", 00:29:18.757 "is_configured": true, 00:29:18.757 "data_offset": 256, 00:29:18.757 "data_size": 7936 00:29:18.757 }, 00:29:18.758 { 00:29:18.758 "name": "BaseBdev2", 00:29:18.758 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:18.758 "is_configured": true, 00:29:18.758 "data_offset": 256, 00:29:18.758 "data_size": 7936 00:29:18.758 } 00:29:18.758 ] 00:29:18.758 }' 00:29:18.758 13:55:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:18.758 13:55:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:19.323 13:55:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:19.323 13:55:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:29:19.581 [2024-07-12 13:55:08.007660] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:19.581 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:29:19.581 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:19.581 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:29:19.839 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:29:19.839 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:29:19.839 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:29:19.839 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:20.098 [2024-07-12 13:55:08.508722] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.098 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:20.357 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:20.357 "name": "raid_bdev1", 00:29:20.357 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:20.357 "strip_size_kb": 0, 00:29:20.357 "state": "online", 00:29:20.357 "raid_level": "raid1", 00:29:20.357 "superblock": true, 00:29:20.357 "num_base_bdevs": 2, 00:29:20.357 "num_base_bdevs_discovered": 1, 00:29:20.357 "num_base_bdevs_operational": 1, 00:29:20.357 "base_bdevs_list": [ 00:29:20.357 { 00:29:20.357 "name": null, 00:29:20.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:20.357 "is_configured": false, 00:29:20.357 "data_offset": 256, 00:29:20.357 "data_size": 7936 00:29:20.357 }, 00:29:20.357 { 00:29:20.357 "name": "BaseBdev2", 00:29:20.357 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:20.357 "is_configured": true, 00:29:20.357 "data_offset": 256, 00:29:20.357 "data_size": 7936 00:29:20.357 } 00:29:20.357 ] 00:29:20.357 }' 00:29:20.357 13:55:08 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:20.357 13:55:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:20.924 13:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:21.182 [2024-07-12 13:55:09.591606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:21.182 [2024-07-12 13:55:09.595240] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2601b90 00:29:21.182 [2024-07-12 13:55:09.597269] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:21.182 13:55:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:29:22.117 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:22.117 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:22.117 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:22.117 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:22.117 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:22.117 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:22.117 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:22.375 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:22.375 "name": "raid_bdev1", 00:29:22.375 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:22.375 "strip_size_kb": 0, 00:29:22.375 "state": "online", 00:29:22.375 "raid_level": "raid1", 00:29:22.375 "superblock": true, 00:29:22.375 "num_base_bdevs": 2, 00:29:22.375 "num_base_bdevs_discovered": 2, 00:29:22.375 "num_base_bdevs_operational": 2, 00:29:22.375 "process": { 00:29:22.375 "type": "rebuild", 00:29:22.375 "target": "spare", 00:29:22.375 "progress": { 00:29:22.375 "blocks": 3072, 00:29:22.375 "percent": 38 00:29:22.375 } 00:29:22.375 }, 00:29:22.375 "base_bdevs_list": [ 00:29:22.375 { 00:29:22.375 "name": "spare", 00:29:22.375 "uuid": "474a23a4-ca97-5484-a4c9-3e9df148075c", 00:29:22.375 "is_configured": true, 00:29:22.375 "data_offset": 256, 00:29:22.375 "data_size": 7936 00:29:22.375 }, 00:29:22.375 { 00:29:22.376 "name": "BaseBdev2", 00:29:22.376 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:22.376 "is_configured": true, 00:29:22.376 "data_offset": 256, 00:29:22.376 "data_size": 7936 00:29:22.376 } 00:29:22.376 ] 00:29:22.376 }' 00:29:22.376 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:22.376 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:22.376 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:22.376 13:55:10 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:22.376 13:55:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:22.633 [2024-07-12 13:55:11.179653] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:22.633 [2024-07-12 13:55:11.210077] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:22.633 [2024-07-12 13:55:11.210124] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:22.633 [2024-07-12 13:55:11.210140] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:22.633 [2024-07-12 13:55:11.210148] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:22.892 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:23.150 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:23.150 "name": "raid_bdev1", 00:29:23.150 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:23.150 "strip_size_kb": 0, 00:29:23.150 "state": "online", 00:29:23.150 "raid_level": "raid1", 00:29:23.150 "superblock": true, 00:29:23.150 "num_base_bdevs": 2, 00:29:23.150 "num_base_bdevs_discovered": 1, 00:29:23.150 "num_base_bdevs_operational": 1, 00:29:23.150 "base_bdevs_list": [ 00:29:23.150 { 00:29:23.150 "name": null, 00:29:23.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.150 "is_configured": false, 00:29:23.150 "data_offset": 256, 00:29:23.150 "data_size": 7936 00:29:23.150 }, 00:29:23.150 { 00:29:23.150 "name": "BaseBdev2", 00:29:23.150 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:23.150 "is_configured": true, 00:29:23.150 "data_offset": 256, 00:29:23.150 "data_size": 7936 00:29:23.150 } 00:29:23.150 ] 00:29:23.150 }' 00:29:23.150 
13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:23.150 13:55:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:23.718 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:23.718 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:23.718 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:23.718 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:23.718 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:23.718 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:23.718 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:23.977 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:23.977 "name": "raid_bdev1", 00:29:23.977 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:23.977 "strip_size_kb": 0, 00:29:23.977 "state": "online", 00:29:23.977 "raid_level": "raid1", 00:29:23.977 "superblock": true, 00:29:23.977 "num_base_bdevs": 2, 00:29:23.977 "num_base_bdevs_discovered": 1, 00:29:23.977 "num_base_bdevs_operational": 1, 00:29:23.977 "base_bdevs_list": [ 00:29:23.977 { 00:29:23.977 "name": null, 00:29:23.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.977 "is_configured": false, 00:29:23.977 "data_offset": 256, 00:29:23.977 "data_size": 7936 00:29:23.977 }, 00:29:23.977 { 00:29:23.977 "name": "BaseBdev2", 00:29:23.977 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:23.977 "is_configured": true, 00:29:23.977 "data_offset": 256, 00:29:23.977 "data_size": 7936 00:29:23.977 } 00:29:23.977 ] 00:29:23.977 }' 00:29:23.977 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:23.977 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:23.977 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:23.977 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:23.977 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:24.235 [2024-07-12 13:55:12.657875] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:24.235 [2024-07-12 13:55:12.662060] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25fdbb0 00:29:24.235 [2024-07-12 13:55:12.663574] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:24.235 13:55:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:25.169 13:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
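Editor's note: the verify_raid_bdev_state / verify_raid_bdev_process checks traced above all follow one pattern, a single RPC dump filtered with jq; a minimal sketch, reusing the $rootdir / $rpc_sock shorthand from the earlier sketch:

  raid_bdev_info=$("$rootdir/scripts/rpc.py" -s "$rpc_sock" bdev_raid_get_bdevs all |
      jq -r '.[] | select(.name == "raid_bdev1")')
  # the `// "none"` fallback turns a missing .process object into the literal string "none"
  process_type=$(jq -r '.process.type // "none"' <<< "$raid_bdev_info")
  process_target=$(jq -r '.process.target // "none"' <<< "$raid_bdev_info")
  # while a rebuild is in flight the checks above expect type=rebuild and target=spare;
  # a mismatch makes the [[ ]] test return non-zero and fails the test case
  [[ $process_type == rebuild && $process_target == spare ]]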
00:29:25.169 13:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:25.169 13:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:25.169 13:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:25.169 13:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:25.169 13:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:25.169 13:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:25.428 13:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:25.428 "name": "raid_bdev1", 00:29:25.428 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:25.428 "strip_size_kb": 0, 00:29:25.428 "state": "online", 00:29:25.428 "raid_level": "raid1", 00:29:25.428 "superblock": true, 00:29:25.428 "num_base_bdevs": 2, 00:29:25.428 "num_base_bdevs_discovered": 2, 00:29:25.428 "num_base_bdevs_operational": 2, 00:29:25.428 "process": { 00:29:25.428 "type": "rebuild", 00:29:25.428 "target": "spare", 00:29:25.428 "progress": { 00:29:25.428 "blocks": 3072, 00:29:25.428 "percent": 38 00:29:25.428 } 00:29:25.428 }, 00:29:25.428 "base_bdevs_list": [ 00:29:25.428 { 00:29:25.428 "name": "spare", 00:29:25.428 "uuid": "474a23a4-ca97-5484-a4c9-3e9df148075c", 00:29:25.428 "is_configured": true, 00:29:25.428 "data_offset": 256, 00:29:25.428 "data_size": 7936 00:29:25.428 }, 00:29:25.428 { 00:29:25.428 "name": "BaseBdev2", 00:29:25.428 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:25.428 "is_configured": true, 00:29:25.428 "data_offset": 256, 00:29:25.428 "data_size": 7936 00:29:25.428 } 00:29:25.428 ] 00:29:25.428 }' 00:29:25.428 13:55:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:25.428 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:25.428 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:29:25.687 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1178 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:25.687 13:55:14 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:25.687 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:26.252 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:26.252 "name": "raid_bdev1", 00:29:26.252 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:26.252 "strip_size_kb": 0, 00:29:26.252 "state": "online", 00:29:26.252 "raid_level": "raid1", 00:29:26.252 "superblock": true, 00:29:26.252 "num_base_bdevs": 2, 00:29:26.252 "num_base_bdevs_discovered": 2, 00:29:26.252 "num_base_bdevs_operational": 2, 00:29:26.252 "process": { 00:29:26.252 "type": "rebuild", 00:29:26.252 "target": "spare", 00:29:26.252 "progress": { 00:29:26.252 "blocks": 4608, 00:29:26.252 "percent": 58 00:29:26.252 } 00:29:26.252 }, 00:29:26.252 "base_bdevs_list": [ 00:29:26.252 { 00:29:26.252 "name": "spare", 00:29:26.252 "uuid": "474a23a4-ca97-5484-a4c9-3e9df148075c", 00:29:26.252 "is_configured": true, 00:29:26.252 "data_offset": 256, 00:29:26.252 "data_size": 7936 00:29:26.252 }, 00:29:26.252 { 00:29:26.252 "name": "BaseBdev2", 00:29:26.252 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:26.252 "is_configured": true, 00:29:26.252 "data_offset": 256, 00:29:26.252 "data_size": 7936 00:29:26.252 } 00:29:26.252 ] 00:29:26.252 }' 00:29:26.252 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:26.252 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:26.252 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:26.252 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:26.252 13:55:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:27.211 13:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:27.211 13:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:27.211 13:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:27.211 13:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:27.211 13:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:27.211 13:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:27.211 
13:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.211 13:55:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:27.211 [2024-07-12 13:55:15.787690] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:27.211 [2024-07-12 13:55:15.787751] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:27.211 [2024-07-12 13:55:15.787836] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:27.810 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:27.810 "name": "raid_bdev1", 00:29:27.810 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:27.810 "strip_size_kb": 0, 00:29:27.810 "state": "online", 00:29:27.810 "raid_level": "raid1", 00:29:27.810 "superblock": true, 00:29:27.810 "num_base_bdevs": 2, 00:29:27.810 "num_base_bdevs_discovered": 2, 00:29:27.810 "num_base_bdevs_operational": 2, 00:29:27.811 "base_bdevs_list": [ 00:29:27.811 { 00:29:27.811 "name": "spare", 00:29:27.811 "uuid": "474a23a4-ca97-5484-a4c9-3e9df148075c", 00:29:27.811 "is_configured": true, 00:29:27.811 "data_offset": 256, 00:29:27.811 "data_size": 7936 00:29:27.811 }, 00:29:27.811 { 00:29:27.811 "name": "BaseBdev2", 00:29:27.811 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:27.811 "is_configured": true, 00:29:27.811 "data_offset": 256, 00:29:27.811 "data_size": 7936 00:29:27.811 } 00:29:27.811 ] 00:29:27.811 }' 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.811 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:28.068 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:28.069 "name": "raid_bdev1", 00:29:28.069 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:28.069 "strip_size_kb": 0, 
00:29:28.069 "state": "online", 00:29:28.069 "raid_level": "raid1", 00:29:28.069 "superblock": true, 00:29:28.069 "num_base_bdevs": 2, 00:29:28.069 "num_base_bdevs_discovered": 2, 00:29:28.069 "num_base_bdevs_operational": 2, 00:29:28.069 "base_bdevs_list": [ 00:29:28.069 { 00:29:28.069 "name": "spare", 00:29:28.069 "uuid": "474a23a4-ca97-5484-a4c9-3e9df148075c", 00:29:28.069 "is_configured": true, 00:29:28.069 "data_offset": 256, 00:29:28.069 "data_size": 7936 00:29:28.069 }, 00:29:28.069 { 00:29:28.069 "name": "BaseBdev2", 00:29:28.069 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:28.069 "is_configured": true, 00:29:28.069 "data_offset": 256, 00:29:28.069 "data_size": 7936 00:29:28.069 } 00:29:28.069 ] 00:29:28.069 }' 00:29:28.069 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:28.327 13:55:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:28.587 13:55:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:28.587 "name": "raid_bdev1", 00:29:28.587 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:28.587 "strip_size_kb": 0, 00:29:28.587 "state": "online", 00:29:28.587 "raid_level": "raid1", 00:29:28.587 "superblock": true, 00:29:28.587 "num_base_bdevs": 2, 00:29:28.587 "num_base_bdevs_discovered": 2, 00:29:28.587 "num_base_bdevs_operational": 2, 00:29:28.587 "base_bdevs_list": [ 00:29:28.587 { 00:29:28.587 "name": "spare", 00:29:28.587 "uuid": "474a23a4-ca97-5484-a4c9-3e9df148075c", 00:29:28.587 "is_configured": true, 00:29:28.587 "data_offset": 256, 00:29:28.587 "data_size": 
7936 00:29:28.587 }, 00:29:28.587 { 00:29:28.587 "name": "BaseBdev2", 00:29:28.587 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:28.588 "is_configured": true, 00:29:28.588 "data_offset": 256, 00:29:28.588 "data_size": 7936 00:29:28.588 } 00:29:28.588 ] 00:29:28.588 }' 00:29:28.588 13:55:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:28.588 13:55:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:29.524 13:55:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:29.783 [2024-07-12 13:55:18.114155] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:29.783 [2024-07-12 13:55:18.114185] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:29.783 [2024-07-12 13:55:18.114243] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:29.783 [2024-07-12 13:55:18.114301] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:29.783 [2024-07-12 13:55:18.114313] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2601cb0 name raid_bdev1, state offline 00:29:29.783 13:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:29.783 13:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:29:30.042 13:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:29:30.042 13:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:29:30.042 13:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:29:30.042 13:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:30.611 13:55:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:30.611 [2024-07-12 13:55:19.176919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:30.611 [2024-07-12 13:55:19.176971] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:30.611 [2024-07-12 13:55:19.176992] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2604070 00:29:30.611 [2024-07-12 13:55:19.177005] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:30.611 [2024-07-12 13:55:19.178475] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:30.611 [2024-07-12 13:55:19.178507] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:30.611 [2024-07-12 13:55:19.178565] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:30.611 [2024-07-12 13:55:19.178591] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:30.611 [2024-07-12 13:55:19.178680] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:30.611 spare 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:30.870 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:30.870 [2024-07-12 13:55:19.278991] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2602150 00:29:30.870 [2024-07-12 13:55:19.279012] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:30.870 [2024-07-12 13:55:19.279104] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25fd3a0 00:29:30.870 [2024-07-12 13:55:19.279213] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2602150 00:29:30.870 [2024-07-12 13:55:19.279223] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2602150 00:29:30.870 [2024-07-12 13:55:19.279295] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:31.439 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:31.439 "name": "raid_bdev1", 00:29:31.439 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:31.439 "strip_size_kb": 0, 00:29:31.439 "state": "online", 00:29:31.439 "raid_level": "raid1", 00:29:31.439 "superblock": true, 00:29:31.439 "num_base_bdevs": 2, 00:29:31.439 "num_base_bdevs_discovered": 2, 00:29:31.439 "num_base_bdevs_operational": 2, 00:29:31.439 "base_bdevs_list": [ 00:29:31.439 { 00:29:31.439 "name": "spare", 00:29:31.439 "uuid": "474a23a4-ca97-5484-a4c9-3e9df148075c", 00:29:31.439 "is_configured": true, 00:29:31.439 "data_offset": 256, 00:29:31.439 "data_size": 7936 00:29:31.439 }, 00:29:31.439 { 00:29:31.439 "name": "BaseBdev2", 00:29:31.439 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:31.439 "is_configured": true, 00:29:31.439 "data_offset": 256, 00:29:31.439 "data_size": 7936 00:29:31.439 } 00:29:31.439 ] 00:29:31.439 }' 00:29:31.439 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:29:31.439 13:55:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:32.009 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:32.009 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:32.009 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:32.009 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:32.009 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:32.009 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:32.009 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:32.009 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:32.009 "name": "raid_bdev1", 00:29:32.009 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:32.009 "strip_size_kb": 0, 00:29:32.009 "state": "online", 00:29:32.009 "raid_level": "raid1", 00:29:32.009 "superblock": true, 00:29:32.009 "num_base_bdevs": 2, 00:29:32.009 "num_base_bdevs_discovered": 2, 00:29:32.009 "num_base_bdevs_operational": 2, 00:29:32.009 "base_bdevs_list": [ 00:29:32.009 { 00:29:32.009 "name": "spare", 00:29:32.009 "uuid": "474a23a4-ca97-5484-a4c9-3e9df148075c", 00:29:32.009 "is_configured": true, 00:29:32.009 "data_offset": 256, 00:29:32.009 "data_size": 7936 00:29:32.009 }, 00:29:32.009 { 00:29:32.009 "name": "BaseBdev2", 00:29:32.009 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:32.009 "is_configured": true, 00:29:32.009 "data_offset": 256, 00:29:32.009 "data_size": 7936 00:29:32.009 } 00:29:32.009 ] 00:29:32.009 }' 00:29:32.009 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:32.268 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:32.268 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:32.268 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:32.268 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:32.268 13:55:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:32.836 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:29:32.836 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:33.096 [2024-07-12 13:55:21.503282] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:33.096 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:33.096 
13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:33.096 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:33.096 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:33.096 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:33.096 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:33.096 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:33.096 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:33.096 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:33.096 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:33.096 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:33.096 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:33.355 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:33.355 "name": "raid_bdev1", 00:29:33.355 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:33.355 "strip_size_kb": 0, 00:29:33.355 "state": "online", 00:29:33.355 "raid_level": "raid1", 00:29:33.355 "superblock": true, 00:29:33.355 "num_base_bdevs": 2, 00:29:33.355 "num_base_bdevs_discovered": 1, 00:29:33.355 "num_base_bdevs_operational": 1, 00:29:33.355 "base_bdevs_list": [ 00:29:33.355 { 00:29:33.355 "name": null, 00:29:33.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:33.355 "is_configured": false, 00:29:33.355 "data_offset": 256, 00:29:33.355 "data_size": 7936 00:29:33.355 }, 00:29:33.355 { 00:29:33.355 "name": "BaseBdev2", 00:29:33.355 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:33.355 "is_configured": true, 00:29:33.355 "data_offset": 256, 00:29:33.355 "data_size": 7936 00:29:33.355 } 00:29:33.355 ] 00:29:33.355 }' 00:29:33.355 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:33.355 13:55:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:33.923 13:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:33.923 [2024-07-12 13:55:22.473858] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:33.923 [2024-07-12 13:55:22.474023] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:33.923 [2024-07-12 13:55:22.474039] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
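Editor's note: the remove/re-add cycle exercised from this point on uses the same small set of RPCs seen above; a condensed sketch under the same shorthand assumptions:

  rpc="$rootdir/scripts/rpc.py -s $rpc_sock"
  $rpc bdev_raid_get_bdevs all | jq -r '.[].base_bdevs_list[0].name'   # prints "spare" while it is configured
  $rpc bdev_raid_remove_base_bdev spare                                # degrade raid_bdev1: 1 of 2 base bdevs left
  $rpc bdev_raid_add_base_bdev raid_bdev1 spare                        # re-attach; its superblock is examined and a rebuild starts
  sleep 1                                                              # give the rebuild a moment before re-checking state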
00:29:33.923 [2024-07-12 13:55:22.474068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:33.923 [2024-07-12 13:55:22.477523] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26026d0 00:29:33.923 [2024-07-12 13:55:22.478959] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:33.923 13:55:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:29:35.300 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:35.300 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:35.300 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:35.300 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:35.300 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:35.300 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:35.300 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:35.300 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:35.300 "name": "raid_bdev1", 00:29:35.300 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:35.300 "strip_size_kb": 0, 00:29:35.300 "state": "online", 00:29:35.300 "raid_level": "raid1", 00:29:35.300 "superblock": true, 00:29:35.300 "num_base_bdevs": 2, 00:29:35.300 "num_base_bdevs_discovered": 2, 00:29:35.300 "num_base_bdevs_operational": 2, 00:29:35.300 "process": { 00:29:35.300 "type": "rebuild", 00:29:35.300 "target": "spare", 00:29:35.301 "progress": { 00:29:35.301 "blocks": 3072, 00:29:35.301 "percent": 38 00:29:35.301 } 00:29:35.301 }, 00:29:35.301 "base_bdevs_list": [ 00:29:35.301 { 00:29:35.301 "name": "spare", 00:29:35.301 "uuid": "474a23a4-ca97-5484-a4c9-3e9df148075c", 00:29:35.301 "is_configured": true, 00:29:35.301 "data_offset": 256, 00:29:35.301 "data_size": 7936 00:29:35.301 }, 00:29:35.301 { 00:29:35.301 "name": "BaseBdev2", 00:29:35.301 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:35.301 "is_configured": true, 00:29:35.301 "data_offset": 256, 00:29:35.301 "data_size": 7936 00:29:35.301 } 00:29:35.301 ] 00:29:35.301 }' 00:29:35.301 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:35.301 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:35.301 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:35.559 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:35.559 13:55:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:35.818 [2024-07-12 13:55:24.156239] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:35.818 [2024-07-12 13:55:24.192185] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:35.818 [2024-07-12 13:55:24.192228] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:35.818 [2024-07-12 13:55:24.192244] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:35.818 [2024-07-12 13:55:24.192252] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:35.818 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:35.819 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:35.819 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:35.819 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:35.819 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:35.819 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:35.819 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:35.819 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:35.819 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:35.819 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:35.819 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:35.819 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:36.386 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:36.386 "name": "raid_bdev1", 00:29:36.386 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:36.386 "strip_size_kb": 0, 00:29:36.386 "state": "online", 00:29:36.386 "raid_level": "raid1", 00:29:36.386 "superblock": true, 00:29:36.386 "num_base_bdevs": 2, 00:29:36.386 "num_base_bdevs_discovered": 1, 00:29:36.386 "num_base_bdevs_operational": 1, 00:29:36.386 "base_bdevs_list": [ 00:29:36.386 { 00:29:36.386 "name": null, 00:29:36.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:36.386 "is_configured": false, 00:29:36.386 "data_offset": 256, 00:29:36.386 "data_size": 7936 00:29:36.386 }, 00:29:36.386 { 00:29:36.386 "name": "BaseBdev2", 00:29:36.386 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:36.386 "is_configured": true, 00:29:36.386 "data_offset": 256, 00:29:36.386 "data_size": 7936 00:29:36.386 } 00:29:36.386 ] 00:29:36.386 }' 00:29:36.386 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:36.386 13:55:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:36.954 13:55:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:37.522 [2024-07-12 
13:55:25.840891] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:37.522 [2024-07-12 13:55:25.840957] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:37.522 [2024-07-12 13:55:25.840981] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26015c0 00:29:37.522 [2024-07-12 13:55:25.840994] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:37.522 [2024-07-12 13:55:25.841219] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:37.522 [2024-07-12 13:55:25.841238] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:37.522 [2024-07-12 13:55:25.841301] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:37.522 [2024-07-12 13:55:25.841313] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:37.522 [2024-07-12 13:55:25.841326] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:37.522 [2024-07-12 13:55:25.841346] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:37.522 [2024-07-12 13:55:25.845370] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26019e0 00:29:37.522 spare 00:29:37.522 [2024-07-12 13:55:25.846771] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:37.522 13:55:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:29:38.461 13:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:38.461 13:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:38.461 13:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:38.461 13:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:38.461 13:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:38.461 13:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:38.461 13:55:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:39.030 13:55:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:39.030 "name": "raid_bdev1", 00:29:39.030 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:39.030 "strip_size_kb": 0, 00:29:39.030 "state": "online", 00:29:39.030 "raid_level": "raid1", 00:29:39.030 "superblock": true, 00:29:39.030 "num_base_bdevs": 2, 00:29:39.030 "num_base_bdevs_discovered": 2, 00:29:39.030 "num_base_bdevs_operational": 2, 00:29:39.030 "process": { 00:29:39.030 "type": "rebuild", 00:29:39.030 "target": "spare", 00:29:39.030 "progress": { 00:29:39.030 "blocks": 3840, 00:29:39.030 "percent": 48 00:29:39.030 } 00:29:39.030 }, 00:29:39.030 "base_bdevs_list": [ 00:29:39.030 { 00:29:39.030 "name": "spare", 00:29:39.030 "uuid": "474a23a4-ca97-5484-a4c9-3e9df148075c", 00:29:39.030 "is_configured": true, 00:29:39.030 "data_offset": 256, 00:29:39.030 
"data_size": 7936 00:29:39.030 }, 00:29:39.030 { 00:29:39.030 "name": "BaseBdev2", 00:29:39.030 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:39.030 "is_configured": true, 00:29:39.030 "data_offset": 256, 00:29:39.030 "data_size": 7936 00:29:39.030 } 00:29:39.030 ] 00:29:39.030 }' 00:29:39.030 13:55:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:39.030 13:55:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:39.030 13:55:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:39.030 13:55:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:39.030 13:55:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:39.599 [2024-07-12 13:55:27.992748] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:39.599 [2024-07-12 13:55:28.064438] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:39.599 [2024-07-12 13:55:28.064485] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:39.599 [2024-07-12 13:55:28.064502] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:39.599 [2024-07-12 13:55:28.064510] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:39.599 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:40.168 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:40.168 "name": "raid_bdev1", 00:29:40.168 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:40.168 "strip_size_kb": 0, 00:29:40.168 "state": "online", 00:29:40.168 
"raid_level": "raid1", 00:29:40.168 "superblock": true, 00:29:40.168 "num_base_bdevs": 2, 00:29:40.168 "num_base_bdevs_discovered": 1, 00:29:40.168 "num_base_bdevs_operational": 1, 00:29:40.168 "base_bdevs_list": [ 00:29:40.168 { 00:29:40.168 "name": null, 00:29:40.168 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:40.168 "is_configured": false, 00:29:40.168 "data_offset": 256, 00:29:40.168 "data_size": 7936 00:29:40.168 }, 00:29:40.168 { 00:29:40.168 "name": "BaseBdev2", 00:29:40.168 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:40.168 "is_configured": true, 00:29:40.168 "data_offset": 256, 00:29:40.168 "data_size": 7936 00:29:40.168 } 00:29:40.168 ] 00:29:40.168 }' 00:29:40.168 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:40.168 13:55:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:40.734 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:40.734 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:40.734 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:40.734 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:40.734 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:40.734 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.734 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:40.992 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:40.992 "name": "raid_bdev1", 00:29:40.992 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:40.992 "strip_size_kb": 0, 00:29:40.992 "state": "online", 00:29:40.992 "raid_level": "raid1", 00:29:40.992 "superblock": true, 00:29:40.992 "num_base_bdevs": 2, 00:29:40.993 "num_base_bdevs_discovered": 1, 00:29:40.993 "num_base_bdevs_operational": 1, 00:29:40.993 "base_bdevs_list": [ 00:29:40.993 { 00:29:40.993 "name": null, 00:29:40.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:40.993 "is_configured": false, 00:29:40.993 "data_offset": 256, 00:29:40.993 "data_size": 7936 00:29:40.993 }, 00:29:40.993 { 00:29:40.993 "name": "BaseBdev2", 00:29:40.993 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:40.993 "is_configured": true, 00:29:40.993 "data_offset": 256, 00:29:40.993 "data_size": 7936 00:29:40.993 } 00:29:40.993 ] 00:29:40.993 }' 00:29:40.993 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:40.993 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:40.993 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:41.251 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:41.251 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:41.251 13:55:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:41.509 [2024-07-12 13:55:30.053987] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:41.509 [2024-07-12 13:55:30.054049] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:41.509 [2024-07-12 13:55:30.054073] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2600e50 00:29:41.509 [2024-07-12 13:55:30.054087] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:41.509 [2024-07-12 13:55:30.054268] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:41.509 [2024-07-12 13:55:30.054285] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:41.509 [2024-07-12 13:55:30.054335] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:41.509 [2024-07-12 13:55:30.054347] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:41.509 [2024-07-12 13:55:30.054357] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:41.509 BaseBdev1 00:29:41.509 13:55:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.884 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:43.143 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:43.143 "name": "raid_bdev1", 00:29:43.143 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:43.143 "strip_size_kb": 0, 00:29:43.143 "state": "online", 00:29:43.143 "raid_level": "raid1", 00:29:43.143 
"superblock": true, 00:29:43.143 "num_base_bdevs": 2, 00:29:43.143 "num_base_bdevs_discovered": 1, 00:29:43.143 "num_base_bdevs_operational": 1, 00:29:43.143 "base_bdevs_list": [ 00:29:43.143 { 00:29:43.143 "name": null, 00:29:43.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:43.143 "is_configured": false, 00:29:43.143 "data_offset": 256, 00:29:43.143 "data_size": 7936 00:29:43.143 }, 00:29:43.143 { 00:29:43.143 "name": "BaseBdev2", 00:29:43.143 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:43.143 "is_configured": true, 00:29:43.143 "data_offset": 256, 00:29:43.143 "data_size": 7936 00:29:43.143 } 00:29:43.143 ] 00:29:43.143 }' 00:29:43.143 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:43.143 13:55:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:43.710 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:43.710 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:43.710 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:43.710 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:43.710 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:43.710 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:43.710 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:43.969 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:43.969 "name": "raid_bdev1", 00:29:43.969 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:43.969 "strip_size_kb": 0, 00:29:43.969 "state": "online", 00:29:43.969 "raid_level": "raid1", 00:29:43.969 "superblock": true, 00:29:43.969 "num_base_bdevs": 2, 00:29:43.969 "num_base_bdevs_discovered": 1, 00:29:43.969 "num_base_bdevs_operational": 1, 00:29:43.969 "base_bdevs_list": [ 00:29:43.969 { 00:29:43.969 "name": null, 00:29:43.969 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:43.969 "is_configured": false, 00:29:43.969 "data_offset": 256, 00:29:43.969 "data_size": 7936 00:29:43.969 }, 00:29:43.969 { 00:29:43.969 "name": "BaseBdev2", 00:29:43.969 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:43.969 "is_configured": true, 00:29:43.969 "data_offset": 256, 00:29:43.969 "data_size": 7936 00:29:43.969 } 00:29:43.969 ] 00:29:43.969 }' 00:29:43.969 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:43.969 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:43.969 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:44.228 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:44.229 [2024-07-12 13:55:32.725072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:44.229 [2024-07-12 13:55:32.725199] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:44.229 [2024-07-12 13:55:32.725214] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:44.229 request: 00:29:44.229 { 00:29:44.229 "base_bdev": "BaseBdev1", 00:29:44.229 "raid_bdev": "raid_bdev1", 00:29:44.229 "method": "bdev_raid_add_base_bdev", 00:29:44.229 "req_id": 1 00:29:44.229 } 00:29:44.229 Got JSON-RPC error response 00:29:44.229 response: 00:29:44.229 { 00:29:44.229 "code": -22, 00:29:44.229 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:44.229 } 00:29:44.229 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:44.229 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:44.229 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:44.229 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:44.229 13:55:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:45.606 13:55:33 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:45.606 "name": "raid_bdev1", 00:29:45.606 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:45.606 "strip_size_kb": 0, 00:29:45.606 "state": "online", 00:29:45.606 "raid_level": "raid1", 00:29:45.606 "superblock": true, 00:29:45.606 "num_base_bdevs": 2, 00:29:45.606 "num_base_bdevs_discovered": 1, 00:29:45.606 "num_base_bdevs_operational": 1, 00:29:45.606 "base_bdevs_list": [ 00:29:45.606 { 00:29:45.606 "name": null, 00:29:45.606 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.606 "is_configured": false, 00:29:45.606 "data_offset": 256, 00:29:45.606 "data_size": 7936 00:29:45.606 }, 00:29:45.606 { 00:29:45.606 "name": "BaseBdev2", 00:29:45.606 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:45.606 "is_configured": true, 00:29:45.606 "data_offset": 256, 00:29:45.606 "data_size": 7936 00:29:45.606 } 00:29:45.606 ] 00:29:45.606 }' 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:45.606 13:55:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:46.173 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:46.173 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:46.173 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:46.173 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:46.173 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:46.173 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:46.173 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:46.433 "name": "raid_bdev1", 00:29:46.433 "uuid": "896d6b6c-0996-46da-a9e5-ae0db6b3b926", 00:29:46.433 "strip_size_kb": 0, 00:29:46.433 "state": "online", 00:29:46.433 "raid_level": "raid1", 00:29:46.433 "superblock": true, 00:29:46.433 "num_base_bdevs": 2, 00:29:46.433 "num_base_bdevs_discovered": 1, 00:29:46.433 "num_base_bdevs_operational": 1, 00:29:46.433 "base_bdevs_list": [ 00:29:46.433 { 00:29:46.433 "name": null, 00:29:46.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:46.433 "is_configured": false, 00:29:46.433 "data_offset": 256, 00:29:46.433 "data_size": 7936 00:29:46.433 }, 00:29:46.433 { 00:29:46.433 "name": "BaseBdev2", 00:29:46.433 "uuid": "ce3b0f00-b418-52d1-b009-a75a0ebc9692", 00:29:46.433 "is_configured": true, 00:29:46.433 "data_offset": 256, 00:29:46.433 "data_size": 7936 00:29:46.433 } 00:29:46.433 ] 00:29:46.433 }' 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 593074 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 593074 ']' 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 593074 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 593074 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 593074' 00:29:46.433 killing process with pid 593074 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 593074 00:29:46.433 Received shutdown signal, test time was about 60.000000 seconds 00:29:46.433 00:29:46.433 Latency(us) 00:29:46.433 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:46.433 =================================================================================================================== 00:29:46.433 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:46.433 [2024-07-12 13:55:34.961435] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:46.433 [2024-07-12 13:55:34.961523] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:46.433 [2024-07-12 13:55:34.961570] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:46.433 [2024-07-12 13:55:34.961582] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2602150 name raid_bdev1, state offline 00:29:46.433 13:55:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 593074 00:29:46.433 [2024-07-12 13:55:34.991909] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:46.692 13:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:29:46.692 00:29:46.693 real 0m31.145s 00:29:46.693 user 0m51.275s 00:29:46.693 sys 0m4.201s 00:29:46.693 13:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:46.693 13:55:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:46.693 ************************************ 00:29:46.693 END TEST raid_rebuild_test_sb_md_interleaved 00:29:46.693 ************************************ 00:29:46.693 13:55:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:46.693 13:55:35 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:29:46.693 13:55:35 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:29:46.693 13:55:35 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 593074 ']' 00:29:46.693 13:55:35 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 593074 00:29:46.952 13:55:35 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:29:46.952 00:29:46.952 real 19m28.631s 00:29:46.952 user 33m12.115s 00:29:46.952 sys 3m31.737s 00:29:46.952 13:55:35 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:46.952 13:55:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:46.952 ************************************ 00:29:46.952 END TEST bdev_raid 00:29:46.952 ************************************ 00:29:46.952 13:55:35 -- common/autotest_common.sh@1142 -- # return 0 00:29:46.952 13:55:35 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:46.952 13:55:35 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:46.952 13:55:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:46.952 13:55:35 -- common/autotest_common.sh@10 -- # set +x 00:29:46.952 ************************************ 00:29:46.952 START TEST bdevperf_config 00:29:46.952 ************************************ 00:29:46.952 13:55:35 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:46.952 * Looking for test storage... 
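The bdevperf_config test that begins here feeds /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf two files: conf.json (the bdev layout) and test.conf, a job file assembled by the create_job helper traced below. As a rough sketch only (the exact keys create_job emits are not visible in this excerpt, so the filename= and rw= lines are assumptions based on bdevperf's fio-like job file format), a test.conf built from "create_job global read Malloc0" plus four empty job sections would look roughly like:

    [global]
    filename=Malloc0
    rw=read
    [job0]
    [job1]
    [job2]
    [job3]

bdevperf is then run as shown in the trace (bdevperf -t 2 --json conf.json -j test.conf), and the test asserts only that the job count reported in its output matches the number of sections it created.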
00:29:46.952 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:46.952 00:29:46.952 13:55:35 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:47.212 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:47.212 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:29:47.212 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:47.212 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:47.212 13:55:35 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:49.748 13:55:38 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-12 13:55:35.618449] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:29:49.748 [2024-07-12 13:55:35.618524] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid597562 ] 00:29:49.748 Using job config with 4 jobs 00:29:49.748 [2024-07-12 13:55:35.760073] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:49.748 [2024-07-12 13:55:35.878918] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:49.748 cpumask for '\''job0'\'' is too big 00:29:49.748 cpumask for '\''job1'\'' is too big 00:29:49.748 cpumask for '\''job2'\'' is too big 00:29:49.748 cpumask for '\''job3'\'' is too big 00:29:49.748 Running I/O for 2 seconds... 00:29:49.748 00:29:49.748 Latency(us) 00:29:49.748 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.02 24069.52 23.51 0.00 0.00 10627.46 1837.86 16298.52 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.02 24046.90 23.48 0.00 0.00 10613.48 1837.86 14474.91 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.02 24024.44 23.46 0.00 0.00 10599.52 1837.86 12594.31 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.03 24002.04 23.44 0.00 0.00 10586.26 1837.86 10884.67 00:29:49.748 =================================================================================================================== 00:29:49.748 Total : 96142.90 93.89 0.00 0.00 10606.68 1837.86 16298.52' 00:29:49.748 13:55:38 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-12 13:55:35.618449] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:29:49.748 [2024-07-12 13:55:35.618524] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid597562 ] 00:29:49.748 Using job config with 4 jobs 00:29:49.748 [2024-07-12 13:55:35.760073] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:49.748 [2024-07-12 13:55:35.878918] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:49.748 cpumask for '\''job0'\'' is too big 00:29:49.748 cpumask for '\''job1'\'' is too big 00:29:49.748 cpumask for '\''job2'\'' is too big 00:29:49.748 cpumask for '\''job3'\'' is too big 00:29:49.748 Running I/O for 2 seconds... 00:29:49.748 00:29:49.748 Latency(us) 00:29:49.748 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.02 24069.52 23.51 0.00 0.00 10627.46 1837.86 16298.52 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.02 24046.90 23.48 0.00 0.00 10613.48 1837.86 14474.91 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.02 24024.44 23.46 0.00 0.00 10599.52 1837.86 12594.31 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.03 24002.04 23.44 0.00 0.00 10586.26 1837.86 10884.67 00:29:49.748 =================================================================================================================== 00:29:49.748 Total : 96142.90 93.89 0.00 0.00 10606.68 1837.86 16298.52' 00:29:49.748 13:55:38 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 13:55:35.618449] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:29:49.748 [2024-07-12 13:55:35.618524] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid597562 ] 00:29:49.748 Using job config with 4 jobs 00:29:49.748 [2024-07-12 13:55:35.760073] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:49.748 [2024-07-12 13:55:35.878918] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:49.748 cpumask for '\''job0'\'' is too big 00:29:49.748 cpumask for '\''job1'\'' is too big 00:29:49.748 cpumask for '\''job2'\'' is too big 00:29:49.748 cpumask for '\''job3'\'' is too big 00:29:49.748 Running I/O for 2 seconds... 
00:29:49.748 00:29:49.748 Latency(us) 00:29:49.748 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.02 24069.52 23.51 0.00 0.00 10627.46 1837.86 16298.52 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.02 24046.90 23.48 0.00 0.00 10613.48 1837.86 14474.91 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.02 24024.44 23.46 0.00 0.00 10599.52 1837.86 12594.31 00:29:49.748 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:49.748 Malloc0 : 2.03 24002.04 23.44 0.00 0.00 10586.26 1837.86 10884.67 00:29:49.748 =================================================================================================================== 00:29:49.748 Total : 96142.90 93.89 0.00 0.00 10606.68 1837.86 16298.52' 00:29:49.748 13:55:38 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:49.748 13:55:38 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:50.008 13:55:38 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:29:50.008 13:55:38 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:50.008 [2024-07-12 13:55:38.395151] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:29:50.008 [2024-07-12 13:55:38.395226] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid597921 ] 00:29:50.008 [2024-07-12 13:55:38.540399] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:50.267 [2024-07-12 13:55:38.658042] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:50.267 cpumask for 'job0' is too big 00:29:50.267 cpumask for 'job1' is too big 00:29:50.267 cpumask for 'job2' is too big 00:29:50.267 cpumask for 'job3' is too big 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:29:52.805 Running I/O for 2 seconds... 
00:29:52.805 00:29:52.805 Latency(us) 00:29:52.805 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:52.805 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:52.805 Malloc0 : 2.02 23980.85 23.42 0.00 0.00 10662.79 1852.10 16298.52 00:29:52.805 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:52.805 Malloc0 : 2.02 23958.84 23.40 0.00 0.00 10648.52 1837.86 14417.92 00:29:52.805 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:52.805 Malloc0 : 2.02 23937.00 23.38 0.00 0.00 10635.04 1837.86 12594.31 00:29:52.805 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:52.805 Malloc0 : 2.02 23915.22 23.35 0.00 0.00 10620.00 1837.86 12537.32 00:29:52.805 =================================================================================================================== 00:29:52.805 Total : 95791.91 93.55 0.00 0.00 10641.59 1837.86 16298.52' 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:52.805 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:52.805 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:52.805 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:52.805 13:55:41 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-12 13:55:41.169814] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:29:55.448 [2024-07-12 13:55:41.169881] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid598273 ] 00:29:55.448 Using job config with 3 jobs 00:29:55.448 [2024-07-12 13:55:41.308537] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.448 [2024-07-12 13:55:41.425512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.448 cpumask for '\''job0'\'' is too big 00:29:55.448 cpumask for '\''job1'\'' is too big 00:29:55.448 cpumask for '\''job2'\'' is too big 00:29:55.448 Running I/O for 2 seconds... 00:29:55.448 00:29:55.448 Latency(us) 00:29:55.448 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:55.448 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.448 Malloc0 : 2.02 32645.14 31.88 0.00 0.00 7829.92 1795.12 11454.55 00:29:55.448 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.448 Malloc0 : 2.02 32615.07 31.85 0.00 0.00 7819.97 1780.87 9687.93 00:29:55.448 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.448 Malloc0 : 2.02 32584.15 31.82 0.00 0.00 7810.45 1773.75 8092.27 00:29:55.448 =================================================================================================================== 00:29:55.448 Total : 97844.36 95.55 0.00 0.00 7820.11 1773.75 11454.55' 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-12 13:55:41.169814] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:29:55.448 [2024-07-12 13:55:41.169881] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid598273 ] 00:29:55.448 Using job config with 3 jobs 00:29:55.448 [2024-07-12 13:55:41.308537] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.448 [2024-07-12 13:55:41.425512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.448 cpumask for '\''job0'\'' is too big 00:29:55.448 cpumask for '\''job1'\'' is too big 00:29:55.448 cpumask for '\''job2'\'' is too big 00:29:55.448 Running I/O for 2 seconds... 
00:29:55.448 00:29:55.448 Latency(us) 00:29:55.448 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:55.448 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.448 Malloc0 : 2.02 32645.14 31.88 0.00 0.00 7829.92 1795.12 11454.55 00:29:55.448 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.448 Malloc0 : 2.02 32615.07 31.85 0.00 0.00 7819.97 1780.87 9687.93 00:29:55.448 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.448 Malloc0 : 2.02 32584.15 31.82 0.00 0.00 7810.45 1773.75 8092.27 00:29:55.448 =================================================================================================================== 00:29:55.448 Total : 97844.36 95.55 0.00 0.00 7820.11 1773.75 11454.55' 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 13:55:41.169814] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:29:55.448 [2024-07-12 13:55:41.169881] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid598273 ] 00:29:55.448 Using job config with 3 jobs 00:29:55.448 [2024-07-12 13:55:41.308537] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.448 [2024-07-12 13:55:41.425512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.448 cpumask for '\''job0'\'' is too big 00:29:55.448 cpumask for '\''job1'\'' is too big 00:29:55.448 cpumask for '\''job2'\'' is too big 00:29:55.448 Running I/O for 2 seconds... 00:29:55.448 00:29:55.448 Latency(us) 00:29:55.448 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:55.448 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.448 Malloc0 : 2.02 32645.14 31.88 0.00 0.00 7829.92 1795.12 11454.55 00:29:55.448 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.448 Malloc0 : 2.02 32615.07 31.85 0.00 0.00 7819.97 1780.87 9687.93 00:29:55.448 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:55.448 Malloc0 : 2.02 32584.15 31.82 0.00 0.00 7810.45 1773.75 8092.27 00:29:55.448 =================================================================================================================== 00:29:55.448 Total : 97844.36 95.55 0.00 0.00 7820.11 1773.75 11454.55' 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 
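At this point the trace is back inside create_job, assembling the mixed read/write config ([global] with rw and Malloc0:Malloc1) for the final 4-job run. After each bdevperf run the test checks the captured text output rather than the exit code. Reconstructed from the common.sh@32 xtrace lines above (the real helper in bdevperf/common.sh may differ in detail), get_num_jobs amounts to:

    get_num_jobs() {
        # $1 is the captured bdevperf output
        echo "$1" | grep -oE 'Using job config with [0-9]+ jobs' | grep -oE '[0-9]+'
    }

so a run that prints "Using job config with 4 jobs" yields 4, which test_config.sh then compares against the expected count, e.g. the [[ 4 == \4 ]] check seen earlier in this trace.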
00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:55.448 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:55.448 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:55.448 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:55.448 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:55.448 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:55.448 13:55:43 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:58.739 13:55:46 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-12 13:55:43.956008] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:29:58.739 [2024-07-12 13:55:43.956064] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid598636 ] 00:29:58.739 Using job config with 4 jobs 00:29:58.739 [2024-07-12 13:55:44.082646] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:58.739 [2024-07-12 13:55:44.206021] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:58.739 cpumask for '\''job0'\'' is too big 00:29:58.739 cpumask for '\''job1'\'' is too big 00:29:58.739 cpumask for '\''job2'\'' is too big 00:29:58.739 cpumask for '\''job3'\'' is too big 00:29:58.739 Running I/O for 2 seconds... 00:29:58.739 00:29:58.739 Latency(us) 00:29:58.739 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:58.739 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc0 : 2.02 12010.81 11.73 0.00 0.00 21297.12 3789.69 33052.94 00:29:58.739 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc1 : 2.03 11999.49 11.72 0.00 0.00 21296.80 4644.51 32824.99 00:29:58.739 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc0 : 2.04 12019.65 11.74 0.00 0.00 21185.01 3732.70 29063.79 00:29:58.739 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc1 : 2.05 12008.51 11.73 0.00 0.00 21185.33 4587.52 29063.79 00:29:58.739 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc0 : 2.05 11997.69 11.72 0.00 0.00 21129.02 3732.70 25302.59 00:29:58.739 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc1 : 2.05 11986.60 11.71 0.00 0.00 21127.73 4587.52 25302.59 00:29:58.739 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc0 : 2.05 11975.84 11.70 0.00 0.00 21072.32 3761.20 21655.37 00:29:58.739 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc1 : 2.05 11964.55 11.68 0.00 0.00 21072.18 4587.52 21655.37 00:29:58.739 =================================================================================================================== 00:29:58.739 Total : 95963.14 93.71 0.00 0.00 21170.36 3732.70 33052.94' 00:29:58.739 13:55:46 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-12 13:55:43.956008] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:29:58.739 [2024-07-12 13:55:43.956064] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid598636 ] 00:29:58.739 Using job config with 4 jobs 00:29:58.739 [2024-07-12 13:55:44.082646] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:58.739 [2024-07-12 13:55:44.206021] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:58.739 cpumask for '\''job0'\'' is too big 00:29:58.739 cpumask for '\''job1'\'' is too big 00:29:58.739 cpumask for '\''job2'\'' is too big 00:29:58.739 cpumask for '\''job3'\'' is too big 00:29:58.739 Running I/O for 2 seconds... 
00:29:58.739 00:29:58.739 Latency(us) 00:29:58.739 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:58.739 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc0 : 2.02 12010.81 11.73 0.00 0.00 21297.12 3789.69 33052.94 00:29:58.739 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc1 : 2.03 11999.49 11.72 0.00 0.00 21296.80 4644.51 32824.99 00:29:58.739 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc0 : 2.04 12019.65 11.74 0.00 0.00 21185.01 3732.70 29063.79 00:29:58.739 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc1 : 2.05 12008.51 11.73 0.00 0.00 21185.33 4587.52 29063.79 00:29:58.739 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc0 : 2.05 11997.69 11.72 0.00 0.00 21129.02 3732.70 25302.59 00:29:58.739 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc1 : 2.05 11986.60 11.71 0.00 0.00 21127.73 4587.52 25302.59 00:29:58.739 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc0 : 2.05 11975.84 11.70 0.00 0.00 21072.32 3761.20 21655.37 00:29:58.739 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.739 Malloc1 : 2.05 11964.55 11.68 0.00 0.00 21072.18 4587.52 21655.37 00:29:58.739 =================================================================================================================== 00:29:58.739 Total : 95963.14 93.71 0.00 0.00 21170.36 3732.70 33052.94' 00:29:58.739 13:55:46 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 13:55:43.956008] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:29:58.739 [2024-07-12 13:55:43.956064] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid598636 ] 00:29:58.739 Using job config with 4 jobs 00:29:58.739 [2024-07-12 13:55:44.082646] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:58.739 [2024-07-12 13:55:44.206021] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:58.739 cpumask for '\''job0'\'' is too big 00:29:58.739 cpumask for '\''job1'\'' is too big 00:29:58.739 cpumask for '\''job2'\'' is too big 00:29:58.739 cpumask for '\''job3'\'' is too big 00:29:58.739 Running I/O for 2 seconds... 
00:29:58.739 00:29:58.739 Latency(us) 00:29:58.740 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:58.740 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.740 Malloc0 : 2.02 12010.81 11.73 0.00 0.00 21297.12 3789.69 33052.94 00:29:58.740 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.740 Malloc1 : 2.03 11999.49 11.72 0.00 0.00 21296.80 4644.51 32824.99 00:29:58.740 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.740 Malloc0 : 2.04 12019.65 11.74 0.00 0.00 21185.01 3732.70 29063.79 00:29:58.740 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.740 Malloc1 : 2.05 12008.51 11.73 0.00 0.00 21185.33 4587.52 29063.79 00:29:58.740 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.740 Malloc0 : 2.05 11997.69 11.72 0.00 0.00 21129.02 3732.70 25302.59 00:29:58.740 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.740 Malloc1 : 2.05 11986.60 11.71 0.00 0.00 21127.73 4587.52 25302.59 00:29:58.740 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.740 Malloc0 : 2.05 11975.84 11.70 0.00 0.00 21072.32 3761.20 21655.37 00:29:58.740 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:58.740 Malloc1 : 2.05 11964.55 11.68 0.00 0.00 21072.18 4587.52 21655.37 00:29:58.740 =================================================================================================================== 00:29:58.740 Total : 95963.14 93.71 0.00 0.00 21170.36 3732.70 33052.94' 00:29:58.740 13:55:46 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:58.740 13:55:46 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:58.740 13:55:46 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:29:58.740 13:55:46 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:29:58.740 13:55:46 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:58.740 13:55:46 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:29:58.740 00:29:58.740 real 0m11.258s 00:29:58.740 user 0m9.922s 00:29:58.740 sys 0m1.193s 00:29:58.740 13:55:46 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:58.740 13:55:46 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:29:58.740 ************************************ 00:29:58.740 END TEST bdevperf_config 00:29:58.740 ************************************ 00:29:58.740 13:55:46 -- common/autotest_common.sh@1142 -- # return 0 00:29:58.740 13:55:46 -- spdk/autotest.sh@192 -- # uname -s 00:29:58.740 13:55:46 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:29:58.740 13:55:46 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:58.740 13:55:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:58.740 13:55:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:58.740 13:55:46 -- common/autotest_common.sh@10 -- # set +x 00:29:58.740 ************************************ 00:29:58.740 START TEST reactor_set_interrupt 00:29:58.740 ************************************ 00:29:58.740 13:55:46 
reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:58.740 * Looking for test storage... 00:29:58.740 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.740 13:55:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:29:58.740 13:55:46 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:58.740 13:55:46 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.740 13:55:46 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.740 13:55:46 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:29:58.740 13:55:46 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:58.740 13:55:46 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:29:58.740 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:29:58.740 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:29:58.740 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:29:58.740 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:29:58.740 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:29:58.740 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:29:58.740 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:29:58.740 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 
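
For readability, the job-count check traced above reduces to a small helper: capture the bdevperf output, pull the "Using job config with N jobs" banner out of it, and compare N against the config file. The sketch below is reconstructed only from the grep pipeline visible in this log, not copied from bdevperf/common.sh; $output stands in for the captured run output, assuming the banner appears once per run.

    # Minimal sketch of the job-count helper, based on the pipeline traced above.
    get_num_jobs() {
        local bdevperf_output=$1
        # bdevperf prints a banner such as "Using job config with 4 jobs";
        # isolate that phrase, then strip it down to the bare number.
        echo "$bdevperf_output" | grep -oE 'Using job config with [0-9]+ jobs' | grep -oE '[0-9]+'
    }

    # Usage as in the test: the run above was driven by a 4-job config file.
    num_jobs=$(get_num_jobs "$output")
    [[ $num_jobs == 4 ]] || echo "expected 4 jobs, got $num_jobs"

The [[ 4 == \4 ]] line in the log is that final assertion, as rendered by xtrace.
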
00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:29:58.740 13:55:46 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:29:58.741 13:55:46 
reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:29:58.741 13:55:46 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:29:58.741 13:55:46 
reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:29:58.741 13:55:46 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:29:58.741 #define SPDK_CONFIG_H 00:29:58.741 #define SPDK_CONFIG_APPS 1 00:29:58.741 #define SPDK_CONFIG_ARCH native 00:29:58.741 #undef SPDK_CONFIG_ASAN 00:29:58.741 #undef SPDK_CONFIG_AVAHI 00:29:58.741 #undef SPDK_CONFIG_CET 00:29:58.741 #define SPDK_CONFIG_COVERAGE 1 00:29:58.741 #define SPDK_CONFIG_CROSS_PREFIX 00:29:58.741 #define SPDK_CONFIG_CRYPTO 1 00:29:58.741 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:29:58.741 #undef SPDK_CONFIG_CUSTOMOCF 00:29:58.741 #undef SPDK_CONFIG_DAOS 00:29:58.741 #define SPDK_CONFIG_DAOS_DIR 00:29:58.741 #define SPDK_CONFIG_DEBUG 1 00:29:58.741 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:29:58.741 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:58.741 #define SPDK_CONFIG_DPDK_INC_DIR 00:29:58.741 #define SPDK_CONFIG_DPDK_LIB_DIR 00:29:58.741 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:29:58.741 #undef SPDK_CONFIG_DPDK_UADK 00:29:58.741 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:58.741 #define SPDK_CONFIG_EXAMPLES 1 00:29:58.741 #undef SPDK_CONFIG_FC 00:29:58.741 #define SPDK_CONFIG_FC_PATH 00:29:58.741 #define SPDK_CONFIG_FIO_PLUGIN 1 00:29:58.741 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:29:58.741 #undef SPDK_CONFIG_FUSE 00:29:58.741 #undef SPDK_CONFIG_FUZZER 00:29:58.741 #define SPDK_CONFIG_FUZZER_LIB 00:29:58.741 #undef SPDK_CONFIG_GOLANG 00:29:58.741 #define 
SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:29:58.741 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:29:58.741 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:29:58.741 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:29:58.741 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:29:58.741 #undef SPDK_CONFIG_HAVE_LIBBSD 00:29:58.741 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:29:58.741 #define SPDK_CONFIG_IDXD 1 00:29:58.741 #define SPDK_CONFIG_IDXD_KERNEL 1 00:29:58.741 #define SPDK_CONFIG_IPSEC_MB 1 00:29:58.741 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:58.742 #define SPDK_CONFIG_ISAL 1 00:29:58.742 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:29:58.742 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:29:58.742 #define SPDK_CONFIG_LIBDIR 00:29:58.742 #undef SPDK_CONFIG_LTO 00:29:58.742 #define SPDK_CONFIG_MAX_LCORES 128 00:29:58.742 #define SPDK_CONFIG_NVME_CUSE 1 00:29:58.742 #undef SPDK_CONFIG_OCF 00:29:58.742 #define SPDK_CONFIG_OCF_PATH 00:29:58.742 #define SPDK_CONFIG_OPENSSL_PATH 00:29:58.742 #undef SPDK_CONFIG_PGO_CAPTURE 00:29:58.742 #define SPDK_CONFIG_PGO_DIR 00:29:58.742 #undef SPDK_CONFIG_PGO_USE 00:29:58.742 #define SPDK_CONFIG_PREFIX /usr/local 00:29:58.742 #undef SPDK_CONFIG_RAID5F 00:29:58.742 #undef SPDK_CONFIG_RBD 00:29:58.742 #define SPDK_CONFIG_RDMA 1 00:29:58.742 #define SPDK_CONFIG_RDMA_PROV verbs 00:29:58.742 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:29:58.742 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:29:58.742 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:29:58.742 #define SPDK_CONFIG_SHARED 1 00:29:58.742 #undef SPDK_CONFIG_SMA 00:29:58.742 #define SPDK_CONFIG_TESTS 1 00:29:58.742 #undef SPDK_CONFIG_TSAN 00:29:58.742 #define SPDK_CONFIG_UBLK 1 00:29:58.742 #define SPDK_CONFIG_UBSAN 1 00:29:58.742 #undef SPDK_CONFIG_UNIT_TESTS 00:29:58.742 #undef SPDK_CONFIG_URING 00:29:58.742 #define SPDK_CONFIG_URING_PATH 00:29:58.742 #undef SPDK_CONFIG_URING_ZNS 00:29:58.742 #undef SPDK_CONFIG_USDT 00:29:58.742 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:29:58.742 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:29:58.742 #undef SPDK_CONFIG_VFIO_USER 00:29:58.742 #define SPDK_CONFIG_VFIO_USER_DIR 00:29:58.742 #define SPDK_CONFIG_VHOST 1 00:29:58.742 #define SPDK_CONFIG_VIRTIO 1 00:29:58.742 #undef SPDK_CONFIG_VTUNE 00:29:58.742 #define SPDK_CONFIG_VTUNE_DIR 00:29:58.742 #define SPDK_CONFIG_WERROR 1 00:29:58.742 #define SPDK_CONFIG_WPDK_DIR 00:29:58.742 #undef SPDK_CONFIG_XNVME 00:29:58.742 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:29:58.742 13:55:46 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:58.742 13:55:46 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:58.742 13:55:46 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:58.742 13:55:46 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:58.742 13:55:46 reactor_set_interrupt -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:58.742 13:55:46 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:58.742 13:55:46 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:58.742 13:55:46 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:29:58.742 13:55:46 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:29:58.742 
13:55:46 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:29:58.742 13:55:46 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:29:58.742 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- 
common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:29:58.743 
13:55:46 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:29:58.743 13:55:46 
reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:58.743 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:29:58.744 13:55:46 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@240 -- # 
DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:29:58.744 
13:55:47 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 599027 ]] 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 599027 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@1707 -- # set_test_storage 2147483648 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.5iyPiw 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.5iyPiw/tests/interrupt /tmp/spdk.5iyPiw 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:29:58.744 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4338139136 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=88992903168 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5515612160 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47250882560 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=3375104 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892349440 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9355264 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253831680 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=425984 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:58.745 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 
00:29:58.745 * Looking for test storage... 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=88992903168 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=7730204672 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.746 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@1709 -- # set -o errtrace 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@1710 -- # shopt -s extdebug 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@1711 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@1713 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@1714 -- # true 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@1716 -- # xtrace_fd 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=599074 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:58.746 13:55:47 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 599074 /var/tmp/spdk.sock 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 599074 ']' 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:58.746 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
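
The start_intr_tgt step above launches the interrupt_tgt example with a 0x07 core mask, installs a cleanup trap, and then blocks in waitforlisten until the RPC socket at /var/tmp/spdk.sock is usable. The snippet below is only an illustrative stand-in for that sequence; waitforlisten's actual implementation is part of autotest_common.sh and does not appear in this trace.

    # Illustrative stand-in for the launch sequence traced above: the polling
    # loop simply waits for the RPC UNIX socket while the target stays alive.
    cpu_mask=0x07                        # reactors on cores 0-2, as in the trace
    rpc_addr=/var/tmp/spdk.sock

    ./build/examples/interrupt_tgt -m "$cpu_mask" -r "$rpc_addr" -E -g &
    intr_tgt_pid=$!
    # The real test traps into its killprocess/cleanup helpers here.
    trap 'kill -9 "$intr_tgt_pid" 2>/dev/null; exit 1' SIGINT SIGTERM EXIT

    for _ in $(seq 1 100); do
        kill -0 "$intr_tgt_pid" 2>/dev/null || { echo "interrupt_tgt exited early"; exit 1; }
        [[ -S $rpc_addr ]] && break      # socket is there, the target is listening
        sleep 0.1
    done
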
00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:58.746 13:55:47 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:58.746 [2024-07-12 13:55:47.119795] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:29:58.746 [2024-07-12 13:55:47.119862] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid599074 ] 00:29:58.746 [2024-07-12 13:55:47.249975] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:59.005 [2024-07-12 13:55:47.357363] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:59.005 [2024-07-12 13:55:47.357463] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:59.005 [2024-07-12 13:55:47.357464] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:59.005 [2024-07-12 13:55:47.438394] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:29:59.573 13:55:48 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:59.573 13:55:48 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:29:59.573 13:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:29:59.573 13:55:48 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:59.832 Malloc0 00:29:59.832 Malloc1 00:29:59.832 Malloc2 00:29:59.832 13:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:29:59.832 13:55:48 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:29:59.832 13:55:48 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:59.832 13:55:48 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:59.832 5000+0 records in 00:29:59.832 5000+0 records out 00:29:59.832 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0255542 s, 401 MB/s 00:29:59.832 13:55:48 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:00.091 AIO0 00:30:00.091 13:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 599074 00:30:00.091 13:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 599074 without_thd 00:30:00.091 13:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=599074 00:30:00.091 13:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:30:00.091 13:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:30:00.091 13:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:00.091 13:55:48 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:00.091 13:55:48 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:00.091 13:55:48 reactor_set_interrupt -- 
interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:00.091 13:55:48 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:00.091 13:55:48 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:00.091 13:55:48 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:00.350 13:55:48 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:00.350 13:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:00.350 13:55:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:00.350 13:55:48 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:00.350 13:55:48 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:00.350 13:55:48 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:00.350 13:55:48 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:00.350 13:55:48 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:00.350 13:55:48 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:00.609 spdk_thread ids are 1 on reactor0. 
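The thread-id lookups traced just above reduce to one thread_get_stats RPC filtered through jq; written out as a standalone command it looks roughly like this (rpc.py path shortened for readability):

rpc_py=./scripts/rpc.py
reactor_cpumask=1      # "0x1" with the hex prefix stripped, exactly as the trace shows
$rpc_py thread_get_stats | \
    jq --arg reactor_cpumask "$reactor_cpumask" \
       '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'

The same filter run with cpumask 4 returns nothing here, since no spdk_thread is pinned to reactor 2 at this point, which is why the trace echoes an empty thd2 list.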
00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 599074 0 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 599074 0 idle 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599074 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599074 -w 256 00:30:00.609 13:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599074 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0' 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599074 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 599074 1 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 599074 1 idle 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599074 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599074 -w 256 00:30:00.867 13:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:01.125 
13:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599083 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599083 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 599074 2 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 599074 2 idle 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599074 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:01.125 13:55:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:01.126 13:55:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:01.126 13:55:49 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:01.126 13:55:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:01.126 13:55:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:01.126 13:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599074 -w 256 00:30:01.126 13:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599084 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599084 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:30:01.384 13:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:30:01.384 13:55:49 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:30:01.384 [2024-07-12 13:55:49.962454] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:01.643 13:55:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:01.901 [2024-07-12 13:55:50.234132] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:01.902 [2024-07-12 13:55:50.234586] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:01.902 13:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:02.161 [2024-07-12 13:55:50.502089] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:30:02.161 [2024-07-12 13:55:50.502464] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 599074 0 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 599074 0 busy 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599074 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599074 -w 256 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599074 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.87 reactor_0' 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599074 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.87 reactor_0 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:02.161 13:55:50 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 599074 2 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 599074 2 busy 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599074 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599074 -w 256 00:30:02.161 13:55:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:02.420 13:55:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599084 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2' 00:30:02.420 13:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599084 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2 00:30:02.420 13:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:02.420 13:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:02.420 13:55:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:02.420 13:55:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:02.420 13:55:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:02.420 13:55:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:02.420 13:55:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:02.420 13:55:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:02.420 13:55:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:02.679 [2024-07-12 13:55:51.118069] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
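The busy/idle probes that dominate this trace come down to a single top sample per check. A rough reconstruction of that probe, with the 70%/30% thresholds copied from the trace (the helper name is mine, not the script's):

reactor_state_ok() {
    local pid=$1 idx=$2 state=$3            # state is "busy" or "idle"
    local line rate
    line=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_${idx}")
    rate=$(echo "$line" | sed -e 's/^\s*//g' | awk '{print $9}')
    rate=${rate%.*}                         # 99.9 -> 99, 0.0 -> 0
    if [[ $state == busy ]]; then
        [[ ${rate:-0} -lt 70 ]] && return 1 # a busy reactor should burn most of a core
    else
        [[ ${rate:-0} -gt 30 ]] && return 1 # an idle reactor should sit near 0% CPU
    fi
    return 0
}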
00:30:02.679 [2024-07-12 13:55:51.118183] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 599074 2 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 599074 2 idle 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599074 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599074 -w 256 00:30:02.679 13:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:02.938 13:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599084 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.61 reactor_2' 00:30:02.938 13:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599084 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.61 reactor_2 00:30:02.938 13:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:02.938 13:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:02.938 13:55:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:02.938 13:55:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:02.938 13:55:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:02.938 13:55:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:02.938 13:55:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:02.938 13:55:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:02.938 13:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:03.197 [2024-07-12 13:55:51.566081] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:03.197 [2024-07-12 13:55:51.566257] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:03.197 13:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:30:03.197 13:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:30:03.197 13:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:30:03.457 [2024-07-12 13:55:51.830485] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 599074 0 00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 599074 0 idle 00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599074 00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599074 -w 256 00:30:03.457 13:55:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599074 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.74 reactor_0' 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599074 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.74 reactor_0 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:30:03.457 13:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 599074 00:30:03.457 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 599074 ']' 00:30:03.457 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 599074 00:30:03.716 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:30:03.716 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:03.716 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 599074 00:30:03.716 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:03.716 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:03.716 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 599074' 00:30:03.716 killing process with pid 599074 00:30:03.716 13:55:52 reactor_set_interrupt -- 
common/autotest_common.sh@967 -- # kill 599074 00:30:03.716 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 599074 00:30:03.976 13:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:30:03.976 13:55:52 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:03.976 13:55:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:30:03.976 13:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:03.976 13:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:03.976 13:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=599847 00:30:03.976 13:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:03.976 13:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:03.976 13:55:52 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 599847 /var/tmp/spdk.sock 00:30:03.976 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 599847 ']' 00:30:03.976 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:03.976 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:03.976 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:03.976 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:03.976 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:03.976 13:55:52 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:03.976 [2024-07-12 13:55:52.402526] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:30:03.976 [2024-07-12 13:55:52.402633] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid599847 ] 00:30:03.976 [2024-07-12 13:55:52.545857] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:04.235 [2024-07-12 13:55:52.649822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:04.235 [2024-07-12 13:55:52.649867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:04.235 [2024-07-12 13:55:52.649869] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:04.235 [2024-07-12 13:55:52.721153] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
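The killprocess teardown traced here follows a recognisable shape; a simplified reconstruction of the non-sudo path seen in this run (the real helper in autotest_common.sh handles more cases):

killprocess() {
    local pid=$1
    kill -0 "$pid" 2>/dev/null || return 0     # already gone, nothing to do
    local name
    name=$(ps --no-headers -o comm= "$pid")
    [[ $name == sudo ]] && return 1            # the sudo-wrapped case is handled differently upstream
    echo "killing process with pid $pid"
    kill "$pid" && wait "$pid" 2>/dev/null
}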
00:30:04.804 13:55:53 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:04.804 13:55:53 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:30:04.804 13:55:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:30:04.804 13:55:53 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:05.063 Malloc0 00:30:05.063 Malloc1 00:30:05.063 Malloc2 00:30:05.322 13:55:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:30:05.322 13:55:53 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:05.322 13:55:53 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:05.322 13:55:53 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:05.322 5000+0 records in 00:30:05.322 5000+0 records out 00:30:05.322 10240000 bytes (10 MB, 9.8 MiB) copied, 0.028066 s, 365 MB/s 00:30:05.322 13:55:53 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:05.581 AIO0 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 599847 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 599847 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=599847 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:05.581 13:55:53 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:05.840 13:55:54 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:05.840 13:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:05.840 13:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:05.840 13:55:54 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:05.840 13:55:54 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:05.840 13:55:54 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:05.840 13:55:54 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:30:05.840 13:55:54 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:05.840 13:55:54 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:06.099 13:55:54 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:30:06.099 13:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:06.099 13:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:06.099 spdk_thread ids are 1 on reactor0. 00:30:06.099 13:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:06.099 13:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 599847 0 00:30:06.099 13:55:54 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 599847 0 idle 00:30:06.099 13:55:54 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599847 00:30:06.099 13:55:54 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:06.099 13:55:54 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599847 -w 256 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599847 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0' 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599847 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 599847 1 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 599847 1 idle 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599847 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:06.100 13:55:54 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:06.100 13:55:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599847 -w 256 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599850 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599850 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 599847 2 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 599847 2 idle 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599847 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:06.359 13:55:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599847 -w 256 00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599851 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599851 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
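The AIO0 bdev registered a few entries back is nothing more than a dd-created file exposed over RPC; the two commands from the trace, written out on their own (paths shortened):

aiofile=./test/interrupt/aiofile
dd if=/dev/zero of="$aiofile" bs=2048 count=5000        # ~10 MB backing file, as the dd summary above shows
./scripts/rpc.py bdev_aio_create "$aiofile" AIO0 2048   # expose it as bdev AIO0 with 2048-byte blocks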
00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:30:06.618 13:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:06.877 [2024-07-12 13:55:55.266625] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:06.877 [2024-07-12 13:55:55.266816] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:30:06.877 [2024-07-12 13:55:55.267063] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:06.877 13:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:07.135 [2024-07-12 13:55:55.519180] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:30:07.135 [2024-07-12 13:55:55.519394] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 599847 0 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 599847 0 busy 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599847 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599847 -w 256 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599847 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.85 reactor_0' 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599847 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.85 reactor_0 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:07.135 13:55:55 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 599847 2 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 599847 2 busy 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599847 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:07.135 13:55:55 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599847 -w 256 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599851 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.35 reactor_2' 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599851 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.35 reactor_2 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:07.394 13:55:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:07.653 [2024-07-12 13:55:56.120903] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
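All of the mode flips in both halves of this test go through the same RPC, loaded from the interrupt_plugin that the earlier PYTHONPATH export makes visible; the calls traced above reduce to (rpc.py path shortened):

rpc="./scripts/rpc.py --plugin interrupt_plugin"
$rpc reactor_set_interrupt_mode 0 -d   # -d: drop reactor 0 to poll mode
$rpc reactor_set_interrupt_mode 2 -d
# ... verify both reactors show up busy in top ...
$rpc reactor_set_interrupt_mode 2      # no flag: back to interrupt mode
$rpc reactor_set_interrupt_mode 0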
00:30:07.653 [2024-07-12 13:55:56.121025] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:07.653 13:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:30:07.653 13:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 599847 2 00:30:07.653 13:55:56 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 599847 2 idle 00:30:07.653 13:55:56 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599847 00:30:07.653 13:55:56 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:07.654 13:55:56 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:07.654 13:55:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:07.654 13:55:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:07.654 13:55:56 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:07.654 13:55:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:07.654 13:55:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:07.654 13:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599847 -w 256 00:30:07.654 13:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:07.912 13:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599851 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.59 reactor_2' 00:30:07.912 13:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599851 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.59 reactor_2 00:30:07.912 13:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:07.912 13:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:07.912 13:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:07.912 13:55:56 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:07.912 13:55:56 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:07.912 13:55:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:07.912 13:55:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:07.912 13:55:56 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:07.913 13:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:08.172 [2024-07-12 13:55:56.550002] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:08.172 [2024-07-12 13:55:56.550204] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:30:08.172 [2024-07-12 13:55:56.550227] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 599847 0 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 599847 0 idle 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=599847 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 599847 -w 256 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 599847 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.70 reactor_0' 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 599847 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.70 reactor_0 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:30:08.172 13:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:30:08.430 13:55:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 599847 00:30:08.430 13:55:56 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 599847 ']' 00:30:08.430 13:55:56 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 599847 00:30:08.430 13:55:56 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:30:08.430 13:55:56 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:08.430 13:55:56 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 599847 00:30:08.430 13:55:56 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:08.430 13:55:56 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:30:08.430 13:55:56 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 599847' 00:30:08.430 killing process with pid 599847 00:30:08.430 13:55:56 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 599847 00:30:08.430 13:55:56 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 599847 00:30:08.689 13:55:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:30:08.689 13:55:57 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:08.689 00:30:08.689 real 0m10.296s 00:30:08.689 user 0m9.727s 00:30:08.689 sys 0m2.223s 00:30:08.689 13:55:57 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:08.689 13:55:57 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:08.689 ************************************ 00:30:08.689 END TEST reactor_set_interrupt 00:30:08.689 ************************************ 00:30:08.689 13:55:57 -- common/autotest_common.sh@1142 -- # return 0 00:30:08.689 13:55:57 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:08.689 13:55:57 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:30:08.689 13:55:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:08.689 13:55:57 -- common/autotest_common.sh@10 -- # set +x 00:30:08.689 ************************************ 00:30:08.689 START TEST reap_unregistered_poller 00:30:08.689 ************************************ 00:30:08.689 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:08.689 * Looking for test storage... 00:30:08.689 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.689 13:55:57 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:08.689 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:08.689 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.689 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.689 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
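The next test opens the same way as the previous one: it derives its directories from its own path before sourcing the common helpers. As a standalone fragment, the dirname/readlink dance traced here is simply:

testdir=$(readlink -f "$(dirname "$0")")   # .../spdk/test/interrupt
rootdir=$(readlink -f "$testdir/../..")    # .../spdk
source "$rootdir/test/common/autotest_common.sh"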
00:30:08.689 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:08.689 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:08.689 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:08.689 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:30:08.689 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:08.689 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:08.689 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:08.689 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:08.689 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:08.689 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:08.689 13:55:57 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:08.690 13:55:57 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@57 -- # 
CONFIG_HAVE_LIBBSD=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:08.951 13:55:57 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:08.951 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@10 -- # 
_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:08.951 13:55:57 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:08.951 #define SPDK_CONFIG_H 00:30:08.951 #define SPDK_CONFIG_APPS 1 00:30:08.951 #define SPDK_CONFIG_ARCH native 00:30:08.951 #undef SPDK_CONFIG_ASAN 00:30:08.951 #undef SPDK_CONFIG_AVAHI 00:30:08.951 #undef SPDK_CONFIG_CET 00:30:08.951 #define SPDK_CONFIG_COVERAGE 1 00:30:08.951 #define SPDK_CONFIG_CROSS_PREFIX 00:30:08.951 #define SPDK_CONFIG_CRYPTO 1 00:30:08.951 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:08.951 #undef SPDK_CONFIG_CUSTOMOCF 00:30:08.951 #undef SPDK_CONFIG_DAOS 00:30:08.951 #define SPDK_CONFIG_DAOS_DIR 00:30:08.951 #define SPDK_CONFIG_DEBUG 1 00:30:08.951 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:08.951 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:08.951 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:08.951 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:08.951 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:08.951 #undef SPDK_CONFIG_DPDK_UADK 00:30:08.951 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:08.951 #define SPDK_CONFIG_EXAMPLES 1 00:30:08.951 #undef SPDK_CONFIG_FC 00:30:08.951 #define SPDK_CONFIG_FC_PATH 00:30:08.951 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:08.951 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:08.951 #undef SPDK_CONFIG_FUSE 00:30:08.951 #undef SPDK_CONFIG_FUZZER 00:30:08.951 #define SPDK_CONFIG_FUZZER_LIB 00:30:08.951 #undef SPDK_CONFIG_GOLANG 00:30:08.951 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:08.951 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:08.951 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:08.951 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:08.951 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:08.951 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:08.951 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:08.951 #define SPDK_CONFIG_IDXD 1 00:30:08.951 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:08.951 #define SPDK_CONFIG_IPSEC_MB 1 00:30:08.951 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:08.951 #define SPDK_CONFIG_ISAL 1 00:30:08.951 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:08.951 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:08.951 #define SPDK_CONFIG_LIBDIR 00:30:08.951 #undef SPDK_CONFIG_LTO 
00:30:08.951 #define SPDK_CONFIG_MAX_LCORES 128 00:30:08.951 #define SPDK_CONFIG_NVME_CUSE 1 00:30:08.951 #undef SPDK_CONFIG_OCF 00:30:08.951 #define SPDK_CONFIG_OCF_PATH 00:30:08.951 #define SPDK_CONFIG_OPENSSL_PATH 00:30:08.951 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:08.951 #define SPDK_CONFIG_PGO_DIR 00:30:08.951 #undef SPDK_CONFIG_PGO_USE 00:30:08.951 #define SPDK_CONFIG_PREFIX /usr/local 00:30:08.951 #undef SPDK_CONFIG_RAID5F 00:30:08.951 #undef SPDK_CONFIG_RBD 00:30:08.952 #define SPDK_CONFIG_RDMA 1 00:30:08.952 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:08.952 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:08.952 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:08.952 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:08.952 #define SPDK_CONFIG_SHARED 1 00:30:08.952 #undef SPDK_CONFIG_SMA 00:30:08.952 #define SPDK_CONFIG_TESTS 1 00:30:08.952 #undef SPDK_CONFIG_TSAN 00:30:08.952 #define SPDK_CONFIG_UBLK 1 00:30:08.952 #define SPDK_CONFIG_UBSAN 1 00:30:08.952 #undef SPDK_CONFIG_UNIT_TESTS 00:30:08.952 #undef SPDK_CONFIG_URING 00:30:08.952 #define SPDK_CONFIG_URING_PATH 00:30:08.952 #undef SPDK_CONFIG_URING_ZNS 00:30:08.952 #undef SPDK_CONFIG_USDT 00:30:08.952 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:08.952 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:08.952 #undef SPDK_CONFIG_VFIO_USER 00:30:08.952 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:08.952 #define SPDK_CONFIG_VHOST 1 00:30:08.952 #define SPDK_CONFIG_VIRTIO 1 00:30:08.952 #undef SPDK_CONFIG_VTUNE 00:30:08.952 #define SPDK_CONFIG_VTUNE_DIR 00:30:08.952 #define SPDK_CONFIG_WERROR 1 00:30:08.952 #define SPDK_CONFIG_WPDK_DIR 00:30:08.952 #undef SPDK_CONFIG_XNVME 00:30:08.952 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:08.952 13:55:57 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:08.952 13:55:57 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:08.952 13:55:57 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:08.952 13:55:57 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:08.952 13:55:57 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:08.952 13:55:57 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:08.952 13:55:57 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:08.952 13:55:57 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:30:08.952 13:55:57 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:08.952 13:55:57 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:30:08.952 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:08.953 13:55:57 reap_unregistered_poller -- 
common/autotest_common.sh@126 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:08.953 13:55:57 
reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
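Editor's note: the long run of "# : 0" / "# : 1" records followed by "export SPDK_TEST_..." above is what bash xtrace prints for a default-then-export idiom; the exact source lines are not shown in this trace, so the snippet below is only an illustration of that pattern, using a few of the values visible here (functional test, ISAL, crypto and vbdev_compress enabled, most others defaulting to 0).

# Illustration of the default-then-export pattern the xtrace above is consistent with:
# values not already set (for example by autorun-spdk.conf) fall back to these defaults.
: "${SPDK_RUN_FUNCTIONAL_TEST:=0}"
: "${SPDK_TEST_ISAL:=0}"
: "${SPDK_TEST_CRYPTO:=0}"
: "${SPDK_TEST_VBDEV_COMPRESS:=0}"
export SPDK_RUN_FUNCTIONAL_TEST SPDK_TEST_ISAL SPDK_TEST_CRYPTO SPDK_TEST_VBDEV_COMPRESS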
00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 600642 ]] 00:30:08.953 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 600642 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@1707 -- # set_test_storage 2147483648 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v 
testdir ]] 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.pxsyeR 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.pxsyeR/tests/interrupt /tmp/spdk.pxsyeR 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=88992739328 00:30:08.954 13:55:57 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5515776000 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47250882560 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=3375104 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892349440 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9355264 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253831680 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=425984 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:30:08.954 * Looking for test storage... 
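Editor's note: the test-storage probe that follows reads df -T into mount/filesystem/size/avail arrays and keeps the first candidate directory whose backing mount has enough free space, otherwise it falls back to the mktemp -udt spdk.XXXXXX path created above. A stripped-down sketch of that space check (assuming GNU df; the array bookkeeping is omitted):

# Illustrative only: find the mount backing a test directory and check free space.
requested_size=2147483648
testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt

mount_point=$(df "$testdir" | awk '$1 !~ /Filesystem/ {print $6}')
avail_bytes=$(df -B1 --output=avail "$testdir" | tail -n1)

if (( avail_bytes >= requested_size )); then
    echo "using $mount_point ($avail_bytes bytes free) for test storage"
else
    echo "not enough space on $mount_point, falling back to the mktemp location" >&2
fi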
00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=88992739328 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=7730368512 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.954 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@1709 -- # set -o errtrace 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@1710 -- # shopt -s extdebug 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@1711 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@1713 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@1714 -- # true 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@1716 -- # xtrace_fd 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=600683 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 600683 /var/tmp/spdk.sock 00:30:08.954 13:55:57 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 600683 ']' 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:08.954 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
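Editor's note: just above, interrupt_common.sh launches build/examples/interrupt_tgt with core mask 0x07 and RPC socket /var/tmp/spdk.sock, records intr_tgt_pid, installs a cleanup trap, and then waits for the listener. The loop below is only a rough approximation of that wait (the real helper retries, with max_retries=100 as the trace shows); it simply polls until the Unix socket appears or the process dies.

# Rough approximation of waiting for an SPDK target's RPC socket (illustrative only).
wait_for_rpc_sock() {
    local pid=$1 sock=${2:-/var/tmp/spdk.sock} retries=${3:-100}
    while (( retries-- > 0 )); do
        kill -0 "$pid" 2>/dev/null || return 1   # target process died
        [[ -S $sock ]] && return 0               # RPC socket is up
        sleep 0.1
    done
    return 1
}
# e.g. wait_for_rpc_sock "$intr_tgt_pid" /var/tmp/spdk.sock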
00:30:08.955 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:08.955 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:08.955 13:55:57 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:08.955 [2024-07-12 13:55:57.476828] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:30:08.955 [2024-07-12 13:55:57.476897] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid600683 ] 00:30:09.213 [2024-07-12 13:55:57.607331] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:09.213 [2024-07-12 13:55:57.712335] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:09.213 [2024-07-12 13:55:57.712457] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:09.213 [2024-07-12 13:55:57.712459] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:09.213 [2024-07-12 13:55:57.786920] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:10.147 13:55:58 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:10.147 13:55:58 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:30:10.147 13:55:58 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:10.147 13:55:58 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:10.147 13:55:58 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:30:10.147 "name": "app_thread", 00:30:10.147 "id": 1, 00:30:10.147 "active_pollers": [], 00:30:10.147 "timed_pollers": [ 00:30:10.147 { 00:30:10.147 "name": "rpc_subsystem_poll_servers", 00:30:10.147 "id": 1, 00:30:10.147 "state": "waiting", 00:30:10.147 "run_count": 0, 00:30:10.147 "busy_count": 0, 00:30:10.147 "period_ticks": 9200000 00:30:10.147 } 00:30:10.147 ], 00:30:10.147 "paused_pollers": [] 00:30:10.147 }' 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/common.sh@76 -- 
# dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:10.147 5000+0 records in 00:30:10.147 5000+0 records out 00:30:10.147 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0197073 s, 520 MB/s 00:30:10.147 13:55:58 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:10.405 AIO0 00:30:10.405 13:55:58 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:10.663 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:30:10.663 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:30:10.663 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:30:10.663 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:10.663 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:10.663 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:10.921 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:30:10.921 "name": "app_thread", 00:30:10.921 "id": 1, 00:30:10.921 "active_pollers": [], 00:30:10.921 "timed_pollers": [ 00:30:10.921 { 00:30:10.921 "name": "rpc_subsystem_poll_servers", 00:30:10.921 "id": 1, 00:30:10.921 "state": "waiting", 00:30:10.921 "run_count": 0, 00:30:10.921 "busy_count": 0, 00:30:10.921 "period_ticks": 9200000 00:30:10.921 } 00:30:10.921 ], 00:30:10.921 "paused_pollers": [] 00:30:10.921 }' 00:30:10.921 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:30:10.921 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:30:10.921 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:30:10.921 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:30:10.921 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:30:10.921 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:30:10.921 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:30:10.921 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 600683 00:30:10.921 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 600683 ']' 00:30:10.921 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 600683 00:30:10.921 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:30:10.921 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:10.921 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 600683 00:30:10.921 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:10.921 13:55:59 
reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:10.921 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 'killing process with pid 600683' 00:30:10.921 killing process with pid 600683 00:30:10.921 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 600683 00:30:10.921 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 600683 00:30:11.179 13:55:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:30:11.179 13:55:59 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:11.179 00:30:11.179 real 0m2.485s 00:30:11.179 user 0m1.587s 00:30:11.179 sys 0m0.665s 00:30:11.179 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:11.179 13:55:59 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:11.179 ************************************ 00:30:11.179 END TEST reap_unregistered_poller 00:30:11.179 ************************************ 00:30:11.179 13:55:59 -- common/autotest_common.sh@1142 -- # return 0 00:30:11.179 13:55:59 -- spdk/autotest.sh@198 -- # uname -s 00:30:11.179 13:55:59 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:30:11.179 13:55:59 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:30:11.179 13:55:59 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:30:11.179 13:55:59 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:30:11.179 13:55:59 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:30:11.179 13:55:59 -- spdk/autotest.sh@260 -- # timing_exit lib 00:30:11.180 13:55:59 -- common/autotest_common.sh@728 -- # xtrace_disable 00:30:11.180 13:55:59 -- common/autotest_common.sh@10 -- # set +x 00:30:11.180 13:55:59 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:30:11.180 13:55:59 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:11.180 13:55:59 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:11.180 13:55:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:11.180 13:55:59 -- common/autotest_common.sh@10 -- # set +x 00:30:11.438 ************************************ 00:30:11.439 START TEST compress_compdev 00:30:11.439 ************************************ 00:30:11.439 13:55:59 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:11.439 * Looking for test storage... 
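Before the compress run continues, the reap_unregistered_poller pass that just finished boils down to a short RPC sequence: back a 10 MB scratch file with an AIO bdev, then read thread_get_pollers through jq and confirm that only rpc_subsystem_poll_servers remains registered on the app thread. A condensed sketch of the commands visible in the trace, with the long workspace prefix abbreviated to $SPDK_DIR purely for readability (the abbreviation is not in the log), looks roughly like this:

    dd if=/dev/zero of=$SPDK_DIR/test/interrupt/aiofile bs=2048 count=5000
    $SPDK_DIR/scripts/rpc.py bdev_aio_create $SPDK_DIR/test/interrupt/aiofile AIO0 2048
    $SPDK_DIR/scripts/rpc.py bdev_wait_for_examine
    # dump the app thread once and pull the poller names out of it
    app_thread=$($SPDK_DIR/scripts/rpc.py thread_get_pollers | jq -r '.threads[0]')
    echo "$app_thread" | jq -r '.active_pollers[].name'     # empty in the trace above
    echo "$app_thread" | jq -r '.timed_pollers[].name'      # only rpc_subsystem_poll_servers
    rm -f $SPDK_DIR/test/interrupt/aiofile                  # cleanup, as interrupt/common.sh does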
00:30:11.439 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:30:11.439 13:55:59 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:11.439 13:55:59 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:11.439 13:55:59 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:11.439 13:55:59 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:11.439 13:55:59 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:11.439 13:55:59 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:11.439 13:55:59 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:11.439 13:55:59 compress_compdev -- paths/export.sh@5 -- # export PATH 00:30:11.439 13:55:59 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:11.439 13:55:59 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:11.439 13:55:59 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:11.439 13:55:59 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:30:11.439 13:55:59 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:30:11.439 13:55:59 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:30:11.439 13:55:59 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:11.439 13:55:59 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=601124 00:30:11.439 13:55:59 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:11.439 13:55:59 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 601124 00:30:11.439 13:55:59 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 601124 ']' 00:30:11.439 13:55:59 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:11.439 13:55:59 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:11.439 13:55:59 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:11.439 13:55:59 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:11.439 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
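The bdevperf instance being waited on here was started by compress.sh with exactly the options shown in the trace; only the workspace prefix is abbreviated to $SPDK_DIR below. The -z flag keeps bdevperf idle until it is told to run over RPC, which matches the separate bdevperf.py perform_tests call that appears later in the log and is what allows the lvol/compress stack to be built first:

    $SPDK_DIR/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 \
        -c $SPDK_DIR/test/compress/dpdk.json &
    bdevperf_pid=$!
    # waitforlisten appears to poll until /var/tmp/spdk.sock accepts RPCs,
    # hence the "Waiting for process to start up..." message just above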
00:30:11.439 13:55:59 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:11.439 13:55:59 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:11.439 [2024-07-12 13:55:59.987342] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:30:11.439 [2024-07-12 13:55:59.987416] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid601124 ] 00:30:11.698 [2024-07-12 13:56:00.126192] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:11.698 [2024-07-12 13:56:00.248216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:11.698 [2024-07-12 13:56:00.248220] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:12.636 [2024-07-12 13:56:01.209239] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:12.895 13:56:01 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:12.895 13:56:01 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:12.895 13:56:01 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:30:12.895 13:56:01 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:12.895 13:56:01 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:13.463 [2024-07-12 13:56:01.896981] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x19c0d00 PMD being used: compress_qat 00:30:13.463 13:56:01 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:13.463 13:56:01 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:13.463 13:56:01 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:13.463 13:56:01 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:13.463 13:56:01 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:13.463 13:56:01 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:13.463 13:56:01 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:13.722 13:56:02 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:13.981 [ 00:30:13.981 { 00:30:13.981 "name": "Nvme0n1", 00:30:13.981 "aliases": [ 00:30:13.981 "01000000-0000-0000-5cd2-e43197705251" 00:30:13.981 ], 00:30:13.981 "product_name": "NVMe disk", 00:30:13.981 "block_size": 512, 00:30:13.981 "num_blocks": 15002931888, 00:30:13.981 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:13.981 "assigned_rate_limits": { 00:30:13.981 "rw_ios_per_sec": 0, 00:30:13.981 "rw_mbytes_per_sec": 0, 00:30:13.981 "r_mbytes_per_sec": 0, 00:30:13.981 "w_mbytes_per_sec": 0 00:30:13.981 }, 00:30:13.981 "claimed": false, 00:30:13.981 "zoned": false, 00:30:13.981 "supported_io_types": { 00:30:13.982 "read": true, 00:30:13.982 "write": true, 00:30:13.982 "unmap": true, 00:30:13.982 "flush": true, 00:30:13.982 "reset": true, 00:30:13.982 "nvme_admin": true, 00:30:13.982 "nvme_io": true, 00:30:13.982 "nvme_io_md": false, 00:30:13.982 "write_zeroes": true, 00:30:13.982 "zcopy": false, 
00:30:13.982 "get_zone_info": false, 00:30:13.982 "zone_management": false, 00:30:13.982 "zone_append": false, 00:30:13.982 "compare": false, 00:30:13.982 "compare_and_write": false, 00:30:13.982 "abort": true, 00:30:13.982 "seek_hole": false, 00:30:13.982 "seek_data": false, 00:30:13.982 "copy": false, 00:30:13.982 "nvme_iov_md": false 00:30:13.982 }, 00:30:13.982 "driver_specific": { 00:30:13.982 "nvme": [ 00:30:13.982 { 00:30:13.982 "pci_address": "0000:5e:00.0", 00:30:13.982 "trid": { 00:30:13.982 "trtype": "PCIe", 00:30:13.982 "traddr": "0000:5e:00.0" 00:30:13.982 }, 00:30:13.982 "ctrlr_data": { 00:30:13.982 "cntlid": 0, 00:30:13.982 "vendor_id": "0x8086", 00:30:13.982 "model_number": "INTEL SSDPF2KX076TZO", 00:30:13.982 "serial_number": "PHAC0301002G7P6CGN", 00:30:13.982 "firmware_revision": "JCV10200", 00:30:13.982 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:13.982 "oacs": { 00:30:13.982 "security": 1, 00:30:13.982 "format": 1, 00:30:13.982 "firmware": 1, 00:30:13.982 "ns_manage": 1 00:30:13.982 }, 00:30:13.982 "multi_ctrlr": false, 00:30:13.982 "ana_reporting": false 00:30:13.982 }, 00:30:13.982 "vs": { 00:30:13.982 "nvme_version": "1.3" 00:30:13.982 }, 00:30:13.982 "ns_data": { 00:30:13.982 "id": 1, 00:30:13.982 "can_share": false 00:30:13.982 }, 00:30:13.982 "security": { 00:30:13.982 "opal": true 00:30:13.982 } 00:30:13.982 } 00:30:13.982 ], 00:30:13.982 "mp_policy": "active_passive" 00:30:13.982 } 00:30:13.982 } 00:30:13.982 ] 00:30:13.982 13:56:02 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:13.982 13:56:02 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:14.241 [2024-07-12 13:56:02.692098] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x180f120 PMD being used: compress_qat 00:30:16.777 05e2820f-4e0c-418f-9ca8-6133098e9ecd 00:30:16.777 13:56:04 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:16.777 6ad418b7-b964-47cc-837b-2832cb6e9295 00:30:16.777 13:56:05 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:16.777 13:56:05 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:16.777 13:56:05 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:16.777 13:56:05 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:16.777 13:56:05 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:16.777 13:56:05 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:16.777 13:56:05 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:17.036 13:56:05 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:17.295 [ 00:30:17.295 { 00:30:17.296 "name": "6ad418b7-b964-47cc-837b-2832cb6e9295", 00:30:17.296 "aliases": [ 00:30:17.296 "lvs0/lv0" 00:30:17.296 ], 00:30:17.296 "product_name": "Logical Volume", 00:30:17.296 "block_size": 512, 00:30:17.296 "num_blocks": 204800, 00:30:17.296 "uuid": "6ad418b7-b964-47cc-837b-2832cb6e9295", 00:30:17.296 "assigned_rate_limits": { 00:30:17.296 "rw_ios_per_sec": 0, 00:30:17.296 "rw_mbytes_per_sec": 0, 00:30:17.296 "r_mbytes_per_sec": 0, 00:30:17.296 
"w_mbytes_per_sec": 0 00:30:17.296 }, 00:30:17.296 "claimed": false, 00:30:17.296 "zoned": false, 00:30:17.296 "supported_io_types": { 00:30:17.296 "read": true, 00:30:17.296 "write": true, 00:30:17.296 "unmap": true, 00:30:17.296 "flush": false, 00:30:17.296 "reset": true, 00:30:17.296 "nvme_admin": false, 00:30:17.296 "nvme_io": false, 00:30:17.296 "nvme_io_md": false, 00:30:17.296 "write_zeroes": true, 00:30:17.296 "zcopy": false, 00:30:17.296 "get_zone_info": false, 00:30:17.296 "zone_management": false, 00:30:17.296 "zone_append": false, 00:30:17.296 "compare": false, 00:30:17.296 "compare_and_write": false, 00:30:17.296 "abort": false, 00:30:17.296 "seek_hole": true, 00:30:17.296 "seek_data": true, 00:30:17.296 "copy": false, 00:30:17.296 "nvme_iov_md": false 00:30:17.296 }, 00:30:17.296 "driver_specific": { 00:30:17.296 "lvol": { 00:30:17.296 "lvol_store_uuid": "05e2820f-4e0c-418f-9ca8-6133098e9ecd", 00:30:17.296 "base_bdev": "Nvme0n1", 00:30:17.296 "thin_provision": true, 00:30:17.296 "num_allocated_clusters": 0, 00:30:17.296 "snapshot": false, 00:30:17.296 "clone": false, 00:30:17.296 "esnap_clone": false 00:30:17.296 } 00:30:17.296 } 00:30:17.296 } 00:30:17.296 ] 00:30:17.296 13:56:05 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:17.296 13:56:05 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:17.296 13:56:05 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:17.555 [2024-07-12 13:56:05.957291] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:17.555 COMP_lvs0/lv0 00:30:17.555 13:56:05 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:17.555 13:56:05 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:17.555 13:56:05 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:17.555 13:56:05 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:17.555 13:56:05 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:17.555 13:56:05 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:17.555 13:56:05 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:17.813 13:56:06 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:18.073 [ 00:30:18.073 { 00:30:18.073 "name": "COMP_lvs0/lv0", 00:30:18.073 "aliases": [ 00:30:18.073 "e47a5f78-e164-5583-ac0c-3b761e116f87" 00:30:18.073 ], 00:30:18.073 "product_name": "compress", 00:30:18.073 "block_size": 512, 00:30:18.073 "num_blocks": 200704, 00:30:18.073 "uuid": "e47a5f78-e164-5583-ac0c-3b761e116f87", 00:30:18.073 "assigned_rate_limits": { 00:30:18.073 "rw_ios_per_sec": 0, 00:30:18.073 "rw_mbytes_per_sec": 0, 00:30:18.073 "r_mbytes_per_sec": 0, 00:30:18.073 "w_mbytes_per_sec": 0 00:30:18.073 }, 00:30:18.073 "claimed": false, 00:30:18.073 "zoned": false, 00:30:18.073 "supported_io_types": { 00:30:18.073 "read": true, 00:30:18.073 "write": true, 00:30:18.073 "unmap": false, 00:30:18.073 "flush": false, 00:30:18.073 "reset": false, 00:30:18.073 "nvme_admin": false, 00:30:18.073 "nvme_io": false, 00:30:18.073 "nvme_io_md": false, 00:30:18.073 "write_zeroes": true, 00:30:18.073 
"zcopy": false, 00:30:18.073 "get_zone_info": false, 00:30:18.073 "zone_management": false, 00:30:18.073 "zone_append": false, 00:30:18.073 "compare": false, 00:30:18.073 "compare_and_write": false, 00:30:18.073 "abort": false, 00:30:18.073 "seek_hole": false, 00:30:18.073 "seek_data": false, 00:30:18.073 "copy": false, 00:30:18.073 "nvme_iov_md": false 00:30:18.073 }, 00:30:18.073 "driver_specific": { 00:30:18.073 "compress": { 00:30:18.073 "name": "COMP_lvs0/lv0", 00:30:18.073 "base_bdev_name": "6ad418b7-b964-47cc-837b-2832cb6e9295" 00:30:18.073 } 00:30:18.073 } 00:30:18.073 } 00:30:18.073 ] 00:30:18.073 13:56:06 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:18.073 13:56:06 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:18.073 [2024-07-12 13:56:06.628447] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f3e501b15c0 PMD being used: compress_qat 00:30:18.073 [2024-07-12 13:56:06.631787] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1bb3740 PMD being used: compress_qat 00:30:18.073 Running I/O for 3 seconds... 00:30:21.364 00:30:21.364 Latency(us) 00:30:21.364 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:21.364 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:21.364 Verification LBA range: start 0x0 length 0x3100 00:30:21.364 COMP_lvs0/lv0 : 3.01 1677.31 6.55 0.00 0.00 18974.87 2464.72 17210.32 00:30:21.364 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:21.364 Verification LBA range: start 0x3100 length 0x3100 00:30:21.364 COMP_lvs0/lv0 : 3.01 1777.93 6.95 0.00 0.00 17884.54 1210.99 14816.83 00:30:21.364 =================================================================================================================== 00:30:21.364 Total : 3455.24 13.50 0.00 0.00 18414.04 1210.99 17210.32 00:30:21.364 0 00:30:21.364 13:56:09 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:21.364 13:56:09 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:21.623 13:56:09 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:21.882 13:56:10 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:21.882 13:56:10 compress_compdev -- compress/compress.sh@78 -- # killprocess 601124 00:30:21.882 13:56:10 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 601124 ']' 00:30:21.882 13:56:10 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 601124 00:30:21.882 13:56:10 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:21.882 13:56:10 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:21.882 13:56:10 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 601124 00:30:21.882 13:56:10 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:21.882 13:56:10 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:21.882 13:56:10 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 601124' 00:30:21.882 killing process with pid 601124 00:30:21.882 13:56:10 compress_compdev -- common/autotest_common.sh@967 -- # kill 601124 00:30:21.882 Received 
shutdown signal, test time was about 3.000000 seconds 00:30:21.882 00:30:21.882 Latency(us) 00:30:21.882 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:21.882 =================================================================================================================== 00:30:21.882 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:21.882 13:56:10 compress_compdev -- common/autotest_common.sh@972 -- # wait 601124 00:30:25.235 13:56:13 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:30:25.235 13:56:13 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:25.235 13:56:13 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=602740 00:30:25.235 13:56:13 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:25.235 13:56:13 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:25.235 13:56:13 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 602740 00:30:25.235 13:56:13 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 602740 ']' 00:30:25.235 13:56:13 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:25.235 13:56:13 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:25.235 13:56:13 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:25.235 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:25.235 13:56:13 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:25.235 13:56:13 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:25.235 [2024-07-12 13:56:13.180998] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
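The pass that just completed (pid 601124) and the one initializing here (pid 602740) drive the same RPC sequence from compress.sh's create_vols and destroy_vols helpers; the trace interleaves each step with bdev_get_bdevs -t 2000 polling and full bdev dumps. Leaving out that polling, and with the workspace prefix abbreviated to $SPDK_DIR and rpc.py to $rpc (both abbreviations are mine), the flow is roughly:

    rpc=$SPDK_DIR/scripts/rpc.py
    $SPDK_DIR/scripts/gen_nvme.sh | $rpc load_subsystem_config   # apparently one pipeline: both are traced as compress.sh@34
    $rpc bdev_wait_for_examine                                   # Nvme0n1 appears from the attached controller
    $rpc bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    $rpc bdev_lvol_create -t -l lvs0 lv0 100                     # 100 MiB thin-provisioned volume (204800 x 512 B)
    $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem           # registers COMP_lvs0/lv0
    $SPDK_DIR/examples/bdev/bdevperf/bdevperf.py perform_tests   # kicks off the 3-second verify workload
    $rpc bdev_compress_delete COMP_lvs0/lv0                      # destroy_vols
    $rpc bdev_lvol_delete_lvstore -l lvs0

The only difference in the run starting here is that create_vols is called with 512, so bdev_compress_create gains -l 512 to force a 512-byte logical block size on the compress bdev.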
00:30:25.235 [2024-07-12 13:56:13.181073] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid602740 ] 00:30:25.235 [2024-07-12 13:56:13.315720] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:25.235 [2024-07-12 13:56:13.431870] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:25.235 [2024-07-12 13:56:13.431874] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:26.171 [2024-07-12 13:56:14.393226] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:26.171 13:56:14 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:26.171 13:56:14 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:26.171 13:56:14 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:30:26.171 13:56:14 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:26.171 13:56:14 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:26.738 [2024-07-12 13:56:15.080808] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1d09d00 PMD being used: compress_qat 00:30:26.738 13:56:15 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:26.738 13:56:15 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:26.738 13:56:15 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:26.738 13:56:15 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:26.738 13:56:15 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:26.738 13:56:15 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:26.738 13:56:15 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:26.996 13:56:15 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:27.255 [ 00:30:27.255 { 00:30:27.255 "name": "Nvme0n1", 00:30:27.255 "aliases": [ 00:30:27.255 "01000000-0000-0000-5cd2-e43197705251" 00:30:27.255 ], 00:30:27.255 "product_name": "NVMe disk", 00:30:27.255 "block_size": 512, 00:30:27.255 "num_blocks": 15002931888, 00:30:27.255 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:27.255 "assigned_rate_limits": { 00:30:27.255 "rw_ios_per_sec": 0, 00:30:27.255 "rw_mbytes_per_sec": 0, 00:30:27.255 "r_mbytes_per_sec": 0, 00:30:27.255 "w_mbytes_per_sec": 0 00:30:27.255 }, 00:30:27.255 "claimed": false, 00:30:27.255 "zoned": false, 00:30:27.255 "supported_io_types": { 00:30:27.255 "read": true, 00:30:27.255 "write": true, 00:30:27.255 "unmap": true, 00:30:27.255 "flush": true, 00:30:27.255 "reset": true, 00:30:27.255 "nvme_admin": true, 00:30:27.255 "nvme_io": true, 00:30:27.255 "nvme_io_md": false, 00:30:27.255 "write_zeroes": true, 00:30:27.255 "zcopy": false, 00:30:27.255 "get_zone_info": false, 00:30:27.255 "zone_management": false, 00:30:27.255 "zone_append": false, 00:30:27.255 "compare": false, 00:30:27.255 "compare_and_write": false, 00:30:27.255 "abort": true, 00:30:27.255 "seek_hole": false, 00:30:27.255 "seek_data": false, 00:30:27.255 
"copy": false, 00:30:27.255 "nvme_iov_md": false 00:30:27.255 }, 00:30:27.255 "driver_specific": { 00:30:27.255 "nvme": [ 00:30:27.255 { 00:30:27.255 "pci_address": "0000:5e:00.0", 00:30:27.255 "trid": { 00:30:27.255 "trtype": "PCIe", 00:30:27.255 "traddr": "0000:5e:00.0" 00:30:27.255 }, 00:30:27.255 "ctrlr_data": { 00:30:27.255 "cntlid": 0, 00:30:27.255 "vendor_id": "0x8086", 00:30:27.255 "model_number": "INTEL SSDPF2KX076TZO", 00:30:27.255 "serial_number": "PHAC0301002G7P6CGN", 00:30:27.255 "firmware_revision": "JCV10200", 00:30:27.255 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:27.255 "oacs": { 00:30:27.255 "security": 1, 00:30:27.255 "format": 1, 00:30:27.255 "firmware": 1, 00:30:27.255 "ns_manage": 1 00:30:27.255 }, 00:30:27.255 "multi_ctrlr": false, 00:30:27.255 "ana_reporting": false 00:30:27.255 }, 00:30:27.255 "vs": { 00:30:27.255 "nvme_version": "1.3" 00:30:27.255 }, 00:30:27.255 "ns_data": { 00:30:27.255 "id": 1, 00:30:27.255 "can_share": false 00:30:27.255 }, 00:30:27.255 "security": { 00:30:27.255 "opal": true 00:30:27.255 } 00:30:27.255 } 00:30:27.255 ], 00:30:27.255 "mp_policy": "active_passive" 00:30:27.255 } 00:30:27.255 } 00:30:27.255 ] 00:30:27.255 13:56:15 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:27.255 13:56:15 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:27.514 [2024-07-12 13:56:15.879948] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b58120 PMD being used: compress_qat 00:30:30.046 34a80362-3c74-462b-a4e8-fe11b71edbc0 00:30:30.046 13:56:18 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:30.046 9e3c5ee6-baec-44e9-b04e-d55355afa6b7 00:30:30.046 13:56:18 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:30.046 13:56:18 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:30.046 13:56:18 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:30.046 13:56:18 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:30.046 13:56:18 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:30.046 13:56:18 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:30.046 13:56:18 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:30.304 13:56:18 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:30.305 [ 00:30:30.305 { 00:30:30.305 "name": "9e3c5ee6-baec-44e9-b04e-d55355afa6b7", 00:30:30.305 "aliases": [ 00:30:30.305 "lvs0/lv0" 00:30:30.305 ], 00:30:30.305 "product_name": "Logical Volume", 00:30:30.305 "block_size": 512, 00:30:30.305 "num_blocks": 204800, 00:30:30.305 "uuid": "9e3c5ee6-baec-44e9-b04e-d55355afa6b7", 00:30:30.305 "assigned_rate_limits": { 00:30:30.305 "rw_ios_per_sec": 0, 00:30:30.305 "rw_mbytes_per_sec": 0, 00:30:30.305 "r_mbytes_per_sec": 0, 00:30:30.305 "w_mbytes_per_sec": 0 00:30:30.305 }, 00:30:30.305 "claimed": false, 00:30:30.305 "zoned": false, 00:30:30.305 "supported_io_types": { 00:30:30.305 "read": true, 00:30:30.305 "write": true, 00:30:30.305 "unmap": true, 00:30:30.305 "flush": false, 00:30:30.305 "reset": true, 00:30:30.305 
"nvme_admin": false, 00:30:30.305 "nvme_io": false, 00:30:30.305 "nvme_io_md": false, 00:30:30.305 "write_zeroes": true, 00:30:30.305 "zcopy": false, 00:30:30.305 "get_zone_info": false, 00:30:30.305 "zone_management": false, 00:30:30.305 "zone_append": false, 00:30:30.305 "compare": false, 00:30:30.305 "compare_and_write": false, 00:30:30.305 "abort": false, 00:30:30.305 "seek_hole": true, 00:30:30.305 "seek_data": true, 00:30:30.305 "copy": false, 00:30:30.305 "nvme_iov_md": false 00:30:30.305 }, 00:30:30.305 "driver_specific": { 00:30:30.305 "lvol": { 00:30:30.305 "lvol_store_uuid": "34a80362-3c74-462b-a4e8-fe11b71edbc0", 00:30:30.305 "base_bdev": "Nvme0n1", 00:30:30.305 "thin_provision": true, 00:30:30.305 "num_allocated_clusters": 0, 00:30:30.305 "snapshot": false, 00:30:30.305 "clone": false, 00:30:30.305 "esnap_clone": false 00:30:30.305 } 00:30:30.305 } 00:30:30.305 } 00:30:30.305 ] 00:30:30.305 13:56:18 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:30.305 13:56:18 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:30.305 13:56:18 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:30.563 [2024-07-12 13:56:19.104987] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:30.563 COMP_lvs0/lv0 00:30:30.563 13:56:19 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:30.563 13:56:19 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:30.563 13:56:19 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:30.563 13:56:19 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:30.563 13:56:19 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:30.563 13:56:19 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:30.563 13:56:19 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:30.822 13:56:19 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:31.080 [ 00:30:31.080 { 00:30:31.080 "name": "COMP_lvs0/lv0", 00:30:31.080 "aliases": [ 00:30:31.080 "650a141c-ccd5-54de-8487-ba0a50009971" 00:30:31.080 ], 00:30:31.080 "product_name": "compress", 00:30:31.080 "block_size": 512, 00:30:31.080 "num_blocks": 200704, 00:30:31.080 "uuid": "650a141c-ccd5-54de-8487-ba0a50009971", 00:30:31.080 "assigned_rate_limits": { 00:30:31.080 "rw_ios_per_sec": 0, 00:30:31.080 "rw_mbytes_per_sec": 0, 00:30:31.080 "r_mbytes_per_sec": 0, 00:30:31.080 "w_mbytes_per_sec": 0 00:30:31.081 }, 00:30:31.081 "claimed": false, 00:30:31.081 "zoned": false, 00:30:31.081 "supported_io_types": { 00:30:31.081 "read": true, 00:30:31.081 "write": true, 00:30:31.081 "unmap": false, 00:30:31.081 "flush": false, 00:30:31.081 "reset": false, 00:30:31.081 "nvme_admin": false, 00:30:31.081 "nvme_io": false, 00:30:31.081 "nvme_io_md": false, 00:30:31.081 "write_zeroes": true, 00:30:31.081 "zcopy": false, 00:30:31.081 "get_zone_info": false, 00:30:31.081 "zone_management": false, 00:30:31.081 "zone_append": false, 00:30:31.081 "compare": false, 00:30:31.081 "compare_and_write": false, 00:30:31.081 "abort": false, 00:30:31.081 "seek_hole": false, 00:30:31.081 
"seek_data": false, 00:30:31.081 "copy": false, 00:30:31.081 "nvme_iov_md": false 00:30:31.081 }, 00:30:31.081 "driver_specific": { 00:30:31.081 "compress": { 00:30:31.081 "name": "COMP_lvs0/lv0", 00:30:31.081 "base_bdev_name": "9e3c5ee6-baec-44e9-b04e-d55355afa6b7" 00:30:31.081 } 00:30:31.081 } 00:30:31.081 } 00:30:31.081 ] 00:30:31.081 13:56:19 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:31.081 13:56:19 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:31.081 [2024-07-12 13:56:19.629470] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f11fc0 PMD being used: compress_qat 00:30:31.081 [2024-07-12 13:56:19.633679] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f837819bc10 PMD being used: compress_qat 00:30:31.081 Running I/O for 3 seconds... 00:30:34.370 00:30:34.370 Latency(us) 00:30:34.370 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:34.370 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:34.370 Verification LBA range: start 0x0 length 0x3100 00:30:34.370 COMP_lvs0/lv0 : 3.01 2788.38 10.89 0.00 0.00 11388.65 954.55 10257.81 00:30:34.370 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:34.370 Verification LBA range: start 0x3100 length 0x3100 00:30:34.370 COMP_lvs0/lv0 : 3.01 2624.41 10.25 0.00 0.00 12062.87 954.55 10542.75 00:30:34.370 =================================================================================================================== 00:30:34.370 Total : 5412.80 21.14 0.00 0.00 11715.55 954.55 10542.75 00:30:34.370 0 00:30:34.370 13:56:22 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:34.370 13:56:22 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:34.370 13:56:22 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:34.629 13:56:23 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:34.629 13:56:23 compress_compdev -- compress/compress.sh@78 -- # killprocess 602740 00:30:34.629 13:56:23 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 602740 ']' 00:30:34.629 13:56:23 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 602740 00:30:34.629 13:56:23 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:34.888 13:56:23 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:34.888 13:56:23 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 602740 00:30:34.888 13:56:23 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:34.888 13:56:23 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:34.888 13:56:23 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 602740' 00:30:34.888 killing process with pid 602740 00:30:34.888 13:56:23 compress_compdev -- common/autotest_common.sh@967 -- # kill 602740 00:30:34.888 Received shutdown signal, test time was about 3.000000 seconds 00:30:34.888 00:30:34.888 Latency(us) 00:30:34.888 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:34.888 
=================================================================================================================== 00:30:34.888 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:34.888 13:56:23 compress_compdev -- common/autotest_common.sh@972 -- # wait 602740 00:30:38.176 13:56:26 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:30:38.176 13:56:26 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:38.176 13:56:26 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=604446 00:30:38.176 13:56:26 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:38.176 13:56:26 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:38.176 13:56:26 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 604446 00:30:38.176 13:56:26 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 604446 ']' 00:30:38.176 13:56:26 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:38.176 13:56:26 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:38.176 13:56:26 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:38.176 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:38.176 13:56:26 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:38.176 13:56:26 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:38.176 [2024-07-12 13:56:26.186281] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
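Across the three bdevperf passes only the compress bdev creation changes; everything else in create_vols is identical. With $rpc abbreviating rpc.py as before:

    $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem           # first pass: no -l, trace reports block_size 512
    $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512    # second pass: explicit 512-byte logical blocks
    $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096   # this pass: 4 KiB logical blocks

The bdev_get_bdevs dump for COMP_lvs0/lv0 later in this pass reports block_size 4096 and num_blocks 25088, versus 512 and 200704 in the two earlier passes, so the same 98 MiB of compress capacity is presented at a different logical block size over the same lvol.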
00:30:38.176 [2024-07-12 13:56:26.186360] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid604446 ] 00:30:38.176 [2024-07-12 13:56:26.323993] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:38.176 [2024-07-12 13:56:26.458254] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:38.176 [2024-07-12 13:56:26.458261] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:39.113 [2024-07-12 13:56:27.434501] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:39.113 13:56:27 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:39.113 13:56:27 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:39.113 13:56:27 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:30:39.113 13:56:27 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:39.113 13:56:27 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:39.681 [2024-07-12 13:56:28.120513] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1825d00 PMD being used: compress_qat 00:30:39.681 13:56:28 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:39.681 13:56:28 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:39.681 13:56:28 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:39.681 13:56:28 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:39.681 13:56:28 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:39.681 13:56:28 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:39.681 13:56:28 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:39.940 13:56:28 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:40.200 [ 00:30:40.200 { 00:30:40.200 "name": "Nvme0n1", 00:30:40.200 "aliases": [ 00:30:40.200 "01000000-0000-0000-5cd2-e43197705251" 00:30:40.200 ], 00:30:40.200 "product_name": "NVMe disk", 00:30:40.200 "block_size": 512, 00:30:40.200 "num_blocks": 15002931888, 00:30:40.200 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:40.200 "assigned_rate_limits": { 00:30:40.200 "rw_ios_per_sec": 0, 00:30:40.200 "rw_mbytes_per_sec": 0, 00:30:40.200 "r_mbytes_per_sec": 0, 00:30:40.200 "w_mbytes_per_sec": 0 00:30:40.200 }, 00:30:40.200 "claimed": false, 00:30:40.200 "zoned": false, 00:30:40.200 "supported_io_types": { 00:30:40.200 "read": true, 00:30:40.200 "write": true, 00:30:40.200 "unmap": true, 00:30:40.200 "flush": true, 00:30:40.200 "reset": true, 00:30:40.200 "nvme_admin": true, 00:30:40.200 "nvme_io": true, 00:30:40.200 "nvme_io_md": false, 00:30:40.200 "write_zeroes": true, 00:30:40.200 "zcopy": false, 00:30:40.200 "get_zone_info": false, 00:30:40.200 "zone_management": false, 00:30:40.200 "zone_append": false, 00:30:40.200 "compare": false, 00:30:40.200 "compare_and_write": false, 00:30:40.200 "abort": true, 00:30:40.200 "seek_hole": false, 00:30:40.200 "seek_data": false, 00:30:40.200 
"copy": false, 00:30:40.200 "nvme_iov_md": false 00:30:40.200 }, 00:30:40.200 "driver_specific": { 00:30:40.200 "nvme": [ 00:30:40.200 { 00:30:40.200 "pci_address": "0000:5e:00.0", 00:30:40.200 "trid": { 00:30:40.200 "trtype": "PCIe", 00:30:40.200 "traddr": "0000:5e:00.0" 00:30:40.200 }, 00:30:40.200 "ctrlr_data": { 00:30:40.200 "cntlid": 0, 00:30:40.200 "vendor_id": "0x8086", 00:30:40.200 "model_number": "INTEL SSDPF2KX076TZO", 00:30:40.200 "serial_number": "PHAC0301002G7P6CGN", 00:30:40.200 "firmware_revision": "JCV10200", 00:30:40.200 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:40.200 "oacs": { 00:30:40.200 "security": 1, 00:30:40.200 "format": 1, 00:30:40.200 "firmware": 1, 00:30:40.200 "ns_manage": 1 00:30:40.200 }, 00:30:40.200 "multi_ctrlr": false, 00:30:40.200 "ana_reporting": false 00:30:40.200 }, 00:30:40.200 "vs": { 00:30:40.200 "nvme_version": "1.3" 00:30:40.200 }, 00:30:40.200 "ns_data": { 00:30:40.200 "id": 1, 00:30:40.200 "can_share": false 00:30:40.200 }, 00:30:40.200 "security": { 00:30:40.200 "opal": true 00:30:40.200 } 00:30:40.200 } 00:30:40.200 ], 00:30:40.200 "mp_policy": "active_passive" 00:30:40.200 } 00:30:40.200 } 00:30:40.200 ] 00:30:40.200 13:56:28 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:40.200 13:56:28 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:40.460 [2024-07-12 13:56:28.911578] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1674120 PMD being used: compress_qat 00:30:42.994 c5ece229-0ae5-4f4b-958f-5440f37357b6 00:30:42.994 13:56:31 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:42.994 79ab06c4-508c-4360-a546-33c2d3fe1a41 00:30:42.994 13:56:31 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:42.994 13:56:31 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:42.994 13:56:31 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:42.994 13:56:31 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:42.994 13:56:31 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:42.994 13:56:31 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:42.994 13:56:31 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:43.253 13:56:31 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:43.512 [ 00:30:43.512 { 00:30:43.512 "name": "79ab06c4-508c-4360-a546-33c2d3fe1a41", 00:30:43.512 "aliases": [ 00:30:43.512 "lvs0/lv0" 00:30:43.512 ], 00:30:43.512 "product_name": "Logical Volume", 00:30:43.512 "block_size": 512, 00:30:43.512 "num_blocks": 204800, 00:30:43.512 "uuid": "79ab06c4-508c-4360-a546-33c2d3fe1a41", 00:30:43.512 "assigned_rate_limits": { 00:30:43.512 "rw_ios_per_sec": 0, 00:30:43.512 "rw_mbytes_per_sec": 0, 00:30:43.512 "r_mbytes_per_sec": 0, 00:30:43.512 "w_mbytes_per_sec": 0 00:30:43.512 }, 00:30:43.512 "claimed": false, 00:30:43.512 "zoned": false, 00:30:43.512 "supported_io_types": { 00:30:43.512 "read": true, 00:30:43.512 "write": true, 00:30:43.512 "unmap": true, 00:30:43.512 "flush": false, 00:30:43.512 "reset": true, 00:30:43.512 
"nvme_admin": false, 00:30:43.512 "nvme_io": false, 00:30:43.512 "nvme_io_md": false, 00:30:43.512 "write_zeroes": true, 00:30:43.512 "zcopy": false, 00:30:43.512 "get_zone_info": false, 00:30:43.512 "zone_management": false, 00:30:43.512 "zone_append": false, 00:30:43.512 "compare": false, 00:30:43.512 "compare_and_write": false, 00:30:43.512 "abort": false, 00:30:43.512 "seek_hole": true, 00:30:43.512 "seek_data": true, 00:30:43.512 "copy": false, 00:30:43.512 "nvme_iov_md": false 00:30:43.512 }, 00:30:43.512 "driver_specific": { 00:30:43.512 "lvol": { 00:30:43.512 "lvol_store_uuid": "c5ece229-0ae5-4f4b-958f-5440f37357b6", 00:30:43.512 "base_bdev": "Nvme0n1", 00:30:43.512 "thin_provision": true, 00:30:43.512 "num_allocated_clusters": 0, 00:30:43.512 "snapshot": false, 00:30:43.512 "clone": false, 00:30:43.512 "esnap_clone": false 00:30:43.513 } 00:30:43.513 } 00:30:43.513 } 00:30:43.513 ] 00:30:43.513 13:56:31 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:43.513 13:56:31 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:43.513 13:56:31 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:43.772 [2024-07-12 13:56:32.160910] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:43.772 COMP_lvs0/lv0 00:30:43.772 13:56:32 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:43.772 13:56:32 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:43.772 13:56:32 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:43.772 13:56:32 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:43.772 13:56:32 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:43.772 13:56:32 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:43.772 13:56:32 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:44.032 13:56:32 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:44.292 [ 00:30:44.292 { 00:30:44.292 "name": "COMP_lvs0/lv0", 00:30:44.292 "aliases": [ 00:30:44.292 "20b27303-19a8-5704-888a-ff9033bbc543" 00:30:44.292 ], 00:30:44.292 "product_name": "compress", 00:30:44.292 "block_size": 4096, 00:30:44.292 "num_blocks": 25088, 00:30:44.292 "uuid": "20b27303-19a8-5704-888a-ff9033bbc543", 00:30:44.292 "assigned_rate_limits": { 00:30:44.292 "rw_ios_per_sec": 0, 00:30:44.292 "rw_mbytes_per_sec": 0, 00:30:44.292 "r_mbytes_per_sec": 0, 00:30:44.292 "w_mbytes_per_sec": 0 00:30:44.292 }, 00:30:44.292 "claimed": false, 00:30:44.292 "zoned": false, 00:30:44.292 "supported_io_types": { 00:30:44.292 "read": true, 00:30:44.292 "write": true, 00:30:44.292 "unmap": false, 00:30:44.292 "flush": false, 00:30:44.292 "reset": false, 00:30:44.292 "nvme_admin": false, 00:30:44.292 "nvme_io": false, 00:30:44.292 "nvme_io_md": false, 00:30:44.292 "write_zeroes": true, 00:30:44.292 "zcopy": false, 00:30:44.292 "get_zone_info": false, 00:30:44.292 "zone_management": false, 00:30:44.292 "zone_append": false, 00:30:44.292 "compare": false, 00:30:44.292 "compare_and_write": false, 00:30:44.292 "abort": false, 00:30:44.292 "seek_hole": false, 00:30:44.292 
"seek_data": false, 00:30:44.292 "copy": false, 00:30:44.292 "nvme_iov_md": false 00:30:44.292 }, 00:30:44.292 "driver_specific": { 00:30:44.292 "compress": { 00:30:44.292 "name": "COMP_lvs0/lv0", 00:30:44.292 "base_bdev_name": "79ab06c4-508c-4360-a546-33c2d3fe1a41" 00:30:44.292 } 00:30:44.292 } 00:30:44.292 } 00:30:44.292 ] 00:30:44.292 13:56:32 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:44.292 13:56:32 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:44.292 [2024-07-12 13:56:32.820018] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa7541b15c0 PMD being used: compress_qat 00:30:44.292 [2024-07-12 13:56:32.823326] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1a189a0 PMD being used: compress_qat 00:30:44.292 Running I/O for 3 seconds... 00:30:47.578 00:30:47.578 Latency(us) 00:30:47.578 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:47.578 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:47.578 Verification LBA range: start 0x0 length 0x3100 00:30:47.578 COMP_lvs0/lv0 : 3.01 1673.91 6.54 0.00 0.00 19011.84 2137.04 16982.37 00:30:47.578 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:47.578 Verification LBA range: start 0x3100 length 0x3100 00:30:47.578 COMP_lvs0/lv0 : 3.01 1782.17 6.96 0.00 0.00 17845.94 1353.46 14930.81 00:30:47.578 =================================================================================================================== 00:30:47.578 Total : 3456.08 13.50 0.00 0.00 18410.85 1353.46 16982.37 00:30:47.578 0 00:30:47.578 13:56:35 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:47.578 13:56:35 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:47.578 13:56:36 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:47.836 13:56:36 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:47.836 13:56:36 compress_compdev -- compress/compress.sh@78 -- # killprocess 604446 00:30:47.836 13:56:36 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 604446 ']' 00:30:47.836 13:56:36 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 604446 00:30:47.836 13:56:36 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:47.836 13:56:36 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:47.836 13:56:36 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 604446 00:30:47.836 13:56:36 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:47.836 13:56:36 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:47.836 13:56:36 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 604446' 00:30:47.836 killing process with pid 604446 00:30:47.836 13:56:36 compress_compdev -- common/autotest_common.sh@967 -- # kill 604446 00:30:47.836 Received shutdown signal, test time was about 3.000000 seconds 00:30:47.836 00:30:47.836 Latency(us) 00:30:47.836 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:47.836 
=================================================================================================================== 00:30:47.836 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:47.836 13:56:36 compress_compdev -- common/autotest_common.sh@972 -- # wait 604446 00:30:51.125 13:56:39 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:30:51.125 13:56:39 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:51.125 13:56:39 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=606111 00:30:51.125 13:56:39 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:51.125 13:56:39 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:30:51.125 13:56:39 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 606111 00:30:51.125 13:56:39 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 606111 ']' 00:30:51.125 13:56:39 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:51.125 13:56:39 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:51.125 13:56:39 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:51.125 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:51.125 13:56:39 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:51.125 13:56:39 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:51.125 [2024-07-12 13:56:39.298876] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
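The final stage swaps bdevperf for the bdevio exerciser but keeps the same compress stack: compress.sh starts bdevio against the same dpdk.json and then rebuilds the volumes before the tests run, and this binary gets three reactors rather than two (the EAL line just below shows -c 0x7). A condensed sketch with the workspace prefix again abbreviated to $SPDK_DIR; reading -w as "hold the tests until started over RPC" is an inference from the fact that create_vols runs after the launch, not something the log states:

    $SPDK_DIR/test/bdev/bdevio/bdevio -c $SPDK_DIR/test/compress/dpdk.json -w &
    bdevio_pid=$!
    # waitforlisten, then the same create_vols sequence as before builds
    # Nvme0n1 -> lvs0 -> lvs0/lv0 -> COMP_lvs0/lv0 for bdevio to exercise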
00:30:51.125 [2024-07-12 13:56:39.298955] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid606111 ] 00:30:51.125 [2024-07-12 13:56:39.443544] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:51.125 [2024-07-12 13:56:39.582830] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:51.125 [2024-07-12 13:56:39.582941] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:51.125 [2024-07-12 13:56:39.582948] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:52.062 [2024-07-12 13:56:40.342655] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:52.062 13:56:40 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:52.062 13:56:40 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:52.062 13:56:40 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:30:52.062 13:56:40 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:52.062 13:56:40 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:52.631 [2024-07-12 13:56:40.999891] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x249a920 PMD being used: compress_qat 00:30:52.631 13:56:41 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:52.631 13:56:41 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:52.631 13:56:41 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:52.631 13:56:41 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:52.631 13:56:41 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:52.631 13:56:41 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:52.631 13:56:41 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:53.199 13:56:41 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:53.199 [ 00:30:53.199 { 00:30:53.199 "name": "Nvme0n1", 00:30:53.199 "aliases": [ 00:30:53.199 "01000000-0000-0000-5cd2-e43197705251" 00:30:53.199 ], 00:30:53.199 "product_name": "NVMe disk", 00:30:53.199 "block_size": 512, 00:30:53.199 "num_blocks": 15002931888, 00:30:53.199 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:53.199 "assigned_rate_limits": { 00:30:53.199 "rw_ios_per_sec": 0, 00:30:53.199 "rw_mbytes_per_sec": 0, 00:30:53.199 "r_mbytes_per_sec": 0, 00:30:53.199 "w_mbytes_per_sec": 0 00:30:53.199 }, 00:30:53.199 "claimed": false, 00:30:53.199 "zoned": false, 00:30:53.199 "supported_io_types": { 00:30:53.199 "read": true, 00:30:53.199 "write": true, 00:30:53.199 "unmap": true, 00:30:53.199 "flush": true, 00:30:53.199 "reset": true, 00:30:53.199 "nvme_admin": true, 00:30:53.199 "nvme_io": true, 00:30:53.199 "nvme_io_md": false, 00:30:53.199 "write_zeroes": true, 00:30:53.199 "zcopy": false, 00:30:53.199 "get_zone_info": false, 00:30:53.199 "zone_management": false, 00:30:53.199 "zone_append": false, 00:30:53.199 "compare": false, 00:30:53.199 "compare_and_write": false, 
00:30:53.199 "abort": true, 00:30:53.199 "seek_hole": false, 00:30:53.199 "seek_data": false, 00:30:53.199 "copy": false, 00:30:53.199 "nvme_iov_md": false 00:30:53.199 }, 00:30:53.199 "driver_specific": { 00:30:53.199 "nvme": [ 00:30:53.199 { 00:30:53.199 "pci_address": "0000:5e:00.0", 00:30:53.199 "trid": { 00:30:53.199 "trtype": "PCIe", 00:30:53.199 "traddr": "0000:5e:00.0" 00:30:53.199 }, 00:30:53.199 "ctrlr_data": { 00:30:53.199 "cntlid": 0, 00:30:53.199 "vendor_id": "0x8086", 00:30:53.199 "model_number": "INTEL SSDPF2KX076TZO", 00:30:53.199 "serial_number": "PHAC0301002G7P6CGN", 00:30:53.199 "firmware_revision": "JCV10200", 00:30:53.199 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:53.199 "oacs": { 00:30:53.199 "security": 1, 00:30:53.199 "format": 1, 00:30:53.199 "firmware": 1, 00:30:53.199 "ns_manage": 1 00:30:53.199 }, 00:30:53.199 "multi_ctrlr": false, 00:30:53.199 "ana_reporting": false 00:30:53.199 }, 00:30:53.199 "vs": { 00:30:53.199 "nvme_version": "1.3" 00:30:53.199 }, 00:30:53.199 "ns_data": { 00:30:53.199 "id": 1, 00:30:53.199 "can_share": false 00:30:53.200 }, 00:30:53.200 "security": { 00:30:53.200 "opal": true 00:30:53.200 } 00:30:53.200 } 00:30:53.200 ], 00:30:53.200 "mp_policy": "active_passive" 00:30:53.200 } 00:30:53.200 } 00:30:53.200 ] 00:30:53.200 13:56:41 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:53.200 13:56:41 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:53.459 [2024-07-12 13:56:41.906166] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22e8d50 PMD being used: compress_qat 00:30:55.996 0859b82d-5350-489a-8749-80ceac448754 00:30:55.996 13:56:44 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:55.996 47e25d20-5190-413f-beb3-3f44e0245095 00:30:55.996 13:56:44 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:55.996 13:56:44 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:55.996 13:56:44 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:55.996 13:56:44 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:55.996 13:56:44 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:55.996 13:56:44 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:55.996 13:56:44 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:56.256 13:56:44 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:56.256 [ 00:30:56.256 { 00:30:56.256 "name": "47e25d20-5190-413f-beb3-3f44e0245095", 00:30:56.256 "aliases": [ 00:30:56.256 "lvs0/lv0" 00:30:56.256 ], 00:30:56.256 "product_name": "Logical Volume", 00:30:56.256 "block_size": 512, 00:30:56.256 "num_blocks": 204800, 00:30:56.256 "uuid": "47e25d20-5190-413f-beb3-3f44e0245095", 00:30:56.256 "assigned_rate_limits": { 00:30:56.256 "rw_ios_per_sec": 0, 00:30:56.256 "rw_mbytes_per_sec": 0, 00:30:56.256 "r_mbytes_per_sec": 0, 00:30:56.256 "w_mbytes_per_sec": 0 00:30:56.256 }, 00:30:56.256 "claimed": false, 00:30:56.256 "zoned": false, 00:30:56.256 "supported_io_types": { 00:30:56.256 "read": true, 00:30:56.256 
"write": true, 00:30:56.256 "unmap": true, 00:30:56.256 "flush": false, 00:30:56.256 "reset": true, 00:30:56.256 "nvme_admin": false, 00:30:56.256 "nvme_io": false, 00:30:56.256 "nvme_io_md": false, 00:30:56.256 "write_zeroes": true, 00:30:56.256 "zcopy": false, 00:30:56.256 "get_zone_info": false, 00:30:56.256 "zone_management": false, 00:30:56.256 "zone_append": false, 00:30:56.256 "compare": false, 00:30:56.256 "compare_and_write": false, 00:30:56.256 "abort": false, 00:30:56.256 "seek_hole": true, 00:30:56.256 "seek_data": true, 00:30:56.256 "copy": false, 00:30:56.256 "nvme_iov_md": false 00:30:56.256 }, 00:30:56.256 "driver_specific": { 00:30:56.256 "lvol": { 00:30:56.256 "lvol_store_uuid": "0859b82d-5350-489a-8749-80ceac448754", 00:30:56.256 "base_bdev": "Nvme0n1", 00:30:56.256 "thin_provision": true, 00:30:56.256 "num_allocated_clusters": 0, 00:30:56.256 "snapshot": false, 00:30:56.256 "clone": false, 00:30:56.256 "esnap_clone": false 00:30:56.256 } 00:30:56.256 } 00:30:56.256 } 00:30:56.256 ] 00:30:56.256 13:56:44 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:56.256 13:56:44 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:56.256 13:56:44 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:56.515 [2024-07-12 13:56:44.937891] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:56.515 COMP_lvs0/lv0 00:30:56.515 13:56:44 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:56.515 13:56:44 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:56.515 13:56:44 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:56.515 13:56:44 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:56.515 13:56:44 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:56.515 13:56:44 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:56.515 13:56:44 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:56.775 13:56:45 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:56.775 [ 00:30:56.775 { 00:30:56.775 "name": "COMP_lvs0/lv0", 00:30:56.775 "aliases": [ 00:30:56.775 "fe1b081c-f30b-5f57-96d2-f0e5d76a6863" 00:30:56.775 ], 00:30:56.775 "product_name": "compress", 00:30:56.775 "block_size": 512, 00:30:56.775 "num_blocks": 200704, 00:30:56.775 "uuid": "fe1b081c-f30b-5f57-96d2-f0e5d76a6863", 00:30:56.775 "assigned_rate_limits": { 00:30:56.775 "rw_ios_per_sec": 0, 00:30:56.775 "rw_mbytes_per_sec": 0, 00:30:56.775 "r_mbytes_per_sec": 0, 00:30:56.775 "w_mbytes_per_sec": 0 00:30:56.775 }, 00:30:56.775 "claimed": false, 00:30:56.775 "zoned": false, 00:30:56.775 "supported_io_types": { 00:30:56.775 "read": true, 00:30:56.775 "write": true, 00:30:56.775 "unmap": false, 00:30:56.775 "flush": false, 00:30:56.775 "reset": false, 00:30:56.775 "nvme_admin": false, 00:30:56.775 "nvme_io": false, 00:30:56.775 "nvme_io_md": false, 00:30:56.775 "write_zeroes": true, 00:30:56.775 "zcopy": false, 00:30:56.775 "get_zone_info": false, 00:30:56.775 "zone_management": false, 00:30:56.775 "zone_append": false, 00:30:56.775 "compare": false, 00:30:56.775 
"compare_and_write": false, 00:30:56.775 "abort": false, 00:30:56.775 "seek_hole": false, 00:30:56.775 "seek_data": false, 00:30:56.775 "copy": false, 00:30:56.775 "nvme_iov_md": false 00:30:56.775 }, 00:30:56.775 "driver_specific": { 00:30:56.775 "compress": { 00:30:56.775 "name": "COMP_lvs0/lv0", 00:30:56.775 "base_bdev_name": "47e25d20-5190-413f-beb3-3f44e0245095" 00:30:56.775 } 00:30:56.775 } 00:30:56.775 } 00:30:56.775 ] 00:30:56.775 13:56:45 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:56.775 13:56:45 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:57.059 [2024-07-12 13:56:45.458411] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fee041b1350 PMD being used: compress_qat 00:30:57.059 I/O targets: 00:30:57.059 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:30:57.059 00:30:57.059 00:30:57.059 CUnit - A unit testing framework for C - Version 2.1-3 00:30:57.059 http://cunit.sourceforge.net/ 00:30:57.059 00:30:57.059 00:30:57.059 Suite: bdevio tests on: COMP_lvs0/lv0 00:30:57.059 Test: blockdev write read block ...passed 00:30:57.059 Test: blockdev write zeroes read block ...passed 00:30:57.059 Test: blockdev write zeroes read no split ...passed 00:30:57.059 Test: blockdev write zeroes read split ...passed 00:30:57.059 Test: blockdev write zeroes read split partial ...passed 00:30:57.059 Test: blockdev reset ...[2024-07-12 13:56:45.561959] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:30:57.059 passed 00:30:57.059 Test: blockdev write read 8 blocks ...passed 00:30:57.059 Test: blockdev write read size > 128k ...passed 00:30:57.059 Test: blockdev write read invalid size ...passed 00:30:57.059 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:57.059 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:57.059 Test: blockdev write read max offset ...passed 00:30:57.059 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:57.059 Test: blockdev writev readv 8 blocks ...passed 00:30:57.059 Test: blockdev writev readv 30 x 1block ...passed 00:30:57.059 Test: blockdev writev readv block ...passed 00:30:57.059 Test: blockdev writev readv size > 128k ...passed 00:30:57.059 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:57.059 Test: blockdev comparev and writev ...passed 00:30:57.059 Test: blockdev nvme passthru rw ...passed 00:30:57.059 Test: blockdev nvme passthru vendor specific ...passed 00:30:57.059 Test: blockdev nvme admin passthru ...passed 00:30:57.059 Test: blockdev copy ...passed 00:30:57.059 00:30:57.059 Run Summary: Type Total Ran Passed Failed Inactive 00:30:57.059 suites 1 1 n/a 0 0 00:30:57.059 tests 23 23 23 0 0 00:30:57.059 asserts 130 130 130 0 n/a 00:30:57.059 00:30:57.059 Elapsed time = 0.235 seconds 00:30:57.059 0 00:30:57.059 13:56:45 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:30:57.059 13:56:45 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:57.349 13:56:45 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:57.607 13:56:46 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:30:57.607 13:56:46 compress_compdev -- compress/compress.sh@62 -- # killprocess 
606111 00:30:57.607 13:56:46 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 606111 ']' 00:30:57.607 13:56:46 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 606111 00:30:57.607 13:56:46 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:57.607 13:56:46 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:57.607 13:56:46 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 606111 00:30:57.607 13:56:46 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:57.607 13:56:46 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:57.607 13:56:46 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 606111' 00:30:57.607 killing process with pid 606111 00:30:57.607 13:56:46 compress_compdev -- common/autotest_common.sh@967 -- # kill 606111 00:30:57.607 13:56:46 compress_compdev -- common/autotest_common.sh@972 -- # wait 606111 00:31:00.893 13:56:48 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:00.893 13:56:48 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:00.893 00:31:00.893 real 0m49.116s 00:31:00.893 user 1m52.165s 00:31:00.893 sys 0m6.380s 00:31:00.893 13:56:48 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:00.893 13:56:48 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:00.893 ************************************ 00:31:00.893 END TEST compress_compdev 00:31:00.893 ************************************ 00:31:00.893 13:56:48 -- common/autotest_common.sh@1142 -- # return 0 00:31:00.893 13:56:48 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:00.893 13:56:48 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:00.893 13:56:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:00.893 13:56:48 -- common/autotest_common.sh@10 -- # set +x 00:31:00.893 ************************************ 00:31:00.893 START TEST compress_isal 00:31:00.893 ************************************ 00:31:00.893 13:56:48 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:00.893 * Looking for test storage... 
00:31:00.893 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:31:00.893 13:56:49 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:00.893 13:56:49 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:00.893 13:56:49 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:00.893 13:56:49 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:00.893 13:56:49 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:00.893 13:56:49 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:00.893 13:56:49 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:00.893 13:56:49 compress_isal -- paths/export.sh@5 -- # export PATH 00:31:00.893 13:56:49 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@47 -- # : 0 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:00.893 13:56:49 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:00.893 13:56:49 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:00.893 13:56:49 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:31:00.893 13:56:49 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:31:00.893 13:56:49 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:31:00.894 13:56:49 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:00.894 13:56:49 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=607418 00:31:00.894 13:56:49 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:00.894 13:56:49 compress_isal -- compress/compress.sh@73 -- # waitforlisten 607418 00:31:00.894 13:56:49 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 607418 ']' 00:31:00.894 13:56:49 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:00.894 13:56:49 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:00.894 13:56:49 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:00.894 13:56:49 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:00.894 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
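For context on what this bdevperf instance is waiting for: it was launched with -z, so it idles on /var/tmp/spdk.sock until the test script has built the volume stack over JSON-RPC and then calls bdevperf.py perform_tests. A minimal sketch of that sequence, reconstructed from the rpc.py calls traced in this log (the workspace paths, the Nvme0n1/lvs0/lv0 names and the /tmp/pmem path are specific to this host):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    # attach the local NVMe controller as bdev Nvme0n1 (both commands are traced from compress.sh line 34; shown here as one pipeline)
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh | $rpc load_subsystem_config
    $rpc bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0    # lvstore on the NVMe bdev
    $rpc bdev_lvol_create -t -l lvs0 lv0 100                          # 100 MiB thin-provisioned lvol lvs0/lv0
    $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem                # compress vbdev COMP_lvs0/lv0 on top of the lvol
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests   # kick off the queued 3 s verify workload
    $rpc bdev_compress_delete COMP_lvs0/lv0                           # teardown once the run completes
    $rpc bdev_lvol_delete_lvstore -l lvs0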
00:31:00.894 13:56:49 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:00.894 13:56:49 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:00.894 [2024-07-12 13:56:49.167751] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:31:00.894 [2024-07-12 13:56:49.167822] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid607418 ] 00:31:00.894 [2024-07-12 13:56:49.301250] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:00.894 [2024-07-12 13:56:49.417738] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:00.894 [2024-07-12 13:56:49.417743] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:01.832 13:56:50 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:01.832 13:56:50 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:01.832 13:56:50 compress_isal -- compress/compress.sh@74 -- # create_vols 00:31:01.832 13:56:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:01.832 13:56:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:02.400 13:56:50 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:02.400 13:56:50 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:02.400 13:56:50 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:02.400 13:56:50 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:02.400 13:56:50 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:02.400 13:56:50 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:02.400 13:56:50 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:02.968 13:56:51 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:03.228 [ 00:31:03.228 { 00:31:03.228 "name": "Nvme0n1", 00:31:03.228 "aliases": [ 00:31:03.228 "01000000-0000-0000-5cd2-e43197705251" 00:31:03.228 ], 00:31:03.228 "product_name": "NVMe disk", 00:31:03.228 "block_size": 512, 00:31:03.228 "num_blocks": 15002931888, 00:31:03.228 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:03.228 "assigned_rate_limits": { 00:31:03.228 "rw_ios_per_sec": 0, 00:31:03.228 "rw_mbytes_per_sec": 0, 00:31:03.228 "r_mbytes_per_sec": 0, 00:31:03.228 "w_mbytes_per_sec": 0 00:31:03.228 }, 00:31:03.228 "claimed": false, 00:31:03.228 "zoned": false, 00:31:03.228 "supported_io_types": { 00:31:03.228 "read": true, 00:31:03.228 "write": true, 00:31:03.228 "unmap": true, 00:31:03.228 "flush": true, 00:31:03.228 "reset": true, 00:31:03.228 "nvme_admin": true, 00:31:03.228 "nvme_io": true, 00:31:03.228 "nvme_io_md": false, 00:31:03.228 "write_zeroes": true, 00:31:03.228 "zcopy": false, 00:31:03.228 "get_zone_info": false, 00:31:03.228 "zone_management": false, 00:31:03.228 "zone_append": false, 00:31:03.228 "compare": false, 00:31:03.228 "compare_and_write": false, 00:31:03.228 "abort": true, 00:31:03.228 "seek_hole": false, 00:31:03.228 "seek_data": false, 00:31:03.228 "copy": false, 00:31:03.228 
"nvme_iov_md": false 00:31:03.228 }, 00:31:03.228 "driver_specific": { 00:31:03.228 "nvme": [ 00:31:03.228 { 00:31:03.228 "pci_address": "0000:5e:00.0", 00:31:03.228 "trid": { 00:31:03.228 "trtype": "PCIe", 00:31:03.228 "traddr": "0000:5e:00.0" 00:31:03.228 }, 00:31:03.228 "ctrlr_data": { 00:31:03.228 "cntlid": 0, 00:31:03.228 "vendor_id": "0x8086", 00:31:03.228 "model_number": "INTEL SSDPF2KX076TZO", 00:31:03.228 "serial_number": "PHAC0301002G7P6CGN", 00:31:03.228 "firmware_revision": "JCV10200", 00:31:03.228 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:03.228 "oacs": { 00:31:03.228 "security": 1, 00:31:03.228 "format": 1, 00:31:03.228 "firmware": 1, 00:31:03.228 "ns_manage": 1 00:31:03.228 }, 00:31:03.228 "multi_ctrlr": false, 00:31:03.228 "ana_reporting": false 00:31:03.228 }, 00:31:03.228 "vs": { 00:31:03.228 "nvme_version": "1.3" 00:31:03.228 }, 00:31:03.228 "ns_data": { 00:31:03.228 "id": 1, 00:31:03.228 "can_share": false 00:31:03.228 }, 00:31:03.228 "security": { 00:31:03.228 "opal": true 00:31:03.228 } 00:31:03.228 } 00:31:03.228 ], 00:31:03.228 "mp_policy": "active_passive" 00:31:03.228 } 00:31:03.228 } 00:31:03.228 ] 00:31:03.228 13:56:51 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:03.228 13:56:51 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:05.762 938abaac-8d36-4bf2-9c9d-7f1a1379dfb8 00:31:05.762 13:56:54 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:06.021 5efec494-b985-4a99-a29f-ad4d57cab985 00:31:06.021 13:56:54 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:06.021 13:56:54 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:06.021 13:56:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:06.021 13:56:54 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:06.021 13:56:54 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:06.021 13:56:54 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:06.021 13:56:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:06.281 13:56:54 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:06.541 [ 00:31:06.541 { 00:31:06.541 "name": "5efec494-b985-4a99-a29f-ad4d57cab985", 00:31:06.541 "aliases": [ 00:31:06.541 "lvs0/lv0" 00:31:06.541 ], 00:31:06.541 "product_name": "Logical Volume", 00:31:06.541 "block_size": 512, 00:31:06.541 "num_blocks": 204800, 00:31:06.541 "uuid": "5efec494-b985-4a99-a29f-ad4d57cab985", 00:31:06.541 "assigned_rate_limits": { 00:31:06.541 "rw_ios_per_sec": 0, 00:31:06.541 "rw_mbytes_per_sec": 0, 00:31:06.541 "r_mbytes_per_sec": 0, 00:31:06.541 "w_mbytes_per_sec": 0 00:31:06.541 }, 00:31:06.541 "claimed": false, 00:31:06.541 "zoned": false, 00:31:06.541 "supported_io_types": { 00:31:06.541 "read": true, 00:31:06.541 "write": true, 00:31:06.541 "unmap": true, 00:31:06.541 "flush": false, 00:31:06.541 "reset": true, 00:31:06.541 "nvme_admin": false, 00:31:06.541 "nvme_io": false, 00:31:06.541 "nvme_io_md": false, 00:31:06.541 "write_zeroes": true, 00:31:06.541 "zcopy": false, 00:31:06.541 "get_zone_info": false, 00:31:06.541 
"zone_management": false, 00:31:06.541 "zone_append": false, 00:31:06.541 "compare": false, 00:31:06.541 "compare_and_write": false, 00:31:06.541 "abort": false, 00:31:06.541 "seek_hole": true, 00:31:06.541 "seek_data": true, 00:31:06.541 "copy": false, 00:31:06.541 "nvme_iov_md": false 00:31:06.541 }, 00:31:06.541 "driver_specific": { 00:31:06.541 "lvol": { 00:31:06.541 "lvol_store_uuid": "938abaac-8d36-4bf2-9c9d-7f1a1379dfb8", 00:31:06.541 "base_bdev": "Nvme0n1", 00:31:06.541 "thin_provision": true, 00:31:06.541 "num_allocated_clusters": 0, 00:31:06.541 "snapshot": false, 00:31:06.541 "clone": false, 00:31:06.541 "esnap_clone": false 00:31:06.541 } 00:31:06.541 } 00:31:06.541 } 00:31:06.541 ] 00:31:06.541 13:56:55 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:06.541 13:56:55 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:06.541 13:56:55 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:07.111 [2024-07-12 13:56:55.522620] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:07.111 COMP_lvs0/lv0 00:31:07.111 13:56:55 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:07.111 13:56:55 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:07.111 13:56:55 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:07.111 13:56:55 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:07.111 13:56:55 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:07.111 13:56:55 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:07.111 13:56:55 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:07.679 13:56:56 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:08.248 [ 00:31:08.248 { 00:31:08.248 "name": "COMP_lvs0/lv0", 00:31:08.248 "aliases": [ 00:31:08.248 "a81a222f-8c0d-5832-bfb1-9829fdf5706b" 00:31:08.248 ], 00:31:08.248 "product_name": "compress", 00:31:08.248 "block_size": 512, 00:31:08.248 "num_blocks": 200704, 00:31:08.248 "uuid": "a81a222f-8c0d-5832-bfb1-9829fdf5706b", 00:31:08.248 "assigned_rate_limits": { 00:31:08.248 "rw_ios_per_sec": 0, 00:31:08.248 "rw_mbytes_per_sec": 0, 00:31:08.248 "r_mbytes_per_sec": 0, 00:31:08.248 "w_mbytes_per_sec": 0 00:31:08.248 }, 00:31:08.248 "claimed": false, 00:31:08.248 "zoned": false, 00:31:08.248 "supported_io_types": { 00:31:08.248 "read": true, 00:31:08.248 "write": true, 00:31:08.248 "unmap": false, 00:31:08.248 "flush": false, 00:31:08.248 "reset": false, 00:31:08.248 "nvme_admin": false, 00:31:08.248 "nvme_io": false, 00:31:08.248 "nvme_io_md": false, 00:31:08.248 "write_zeroes": true, 00:31:08.248 "zcopy": false, 00:31:08.248 "get_zone_info": false, 00:31:08.248 "zone_management": false, 00:31:08.248 "zone_append": false, 00:31:08.248 "compare": false, 00:31:08.248 "compare_and_write": false, 00:31:08.248 "abort": false, 00:31:08.248 "seek_hole": false, 00:31:08.248 "seek_data": false, 00:31:08.248 "copy": false, 00:31:08.248 "nvme_iov_md": false 00:31:08.248 }, 00:31:08.248 "driver_specific": { 00:31:08.248 "compress": { 00:31:08.248 "name": "COMP_lvs0/lv0", 00:31:08.248 "base_bdev_name": 
"5efec494-b985-4a99-a29f-ad4d57cab985" 00:31:08.248 } 00:31:08.248 } 00:31:08.248 } 00:31:08.248 ] 00:31:08.248 13:56:56 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:08.248 13:56:56 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:08.248 Running I/O for 3 seconds... 00:31:11.538 00:31:11.538 Latency(us) 00:31:11.538 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:11.538 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:11.538 Verification LBA range: start 0x0 length 0x3100 00:31:11.538 COMP_lvs0/lv0 : 3.02 1267.47 4.95 0.00 0.00 25118.92 2550.21 21541.40 00:31:11.538 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:11.538 Verification LBA range: start 0x3100 length 0x3100 00:31:11.538 COMP_lvs0/lv0 : 3.01 1269.75 4.96 0.00 0.00 25058.67 1517.30 20515.62 00:31:11.538 =================================================================================================================== 00:31:11.538 Total : 2537.22 9.91 0.00 0.00 25088.78 1517.30 21541.40 00:31:11.538 0 00:31:11.538 13:56:59 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:11.538 13:56:59 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:11.797 13:57:00 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:12.363 13:57:00 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:12.363 13:57:00 compress_isal -- compress/compress.sh@78 -- # killprocess 607418 00:31:12.363 13:57:00 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 607418 ']' 00:31:12.363 13:57:00 compress_isal -- common/autotest_common.sh@952 -- # kill -0 607418 00:31:12.363 13:57:00 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:12.363 13:57:00 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:12.363 13:57:00 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 607418 00:31:12.363 13:57:00 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:12.363 13:57:00 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:12.363 13:57:00 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 607418' 00:31:12.363 killing process with pid 607418 00:31:12.363 13:57:00 compress_isal -- common/autotest_common.sh@967 -- # kill 607418 00:31:12.363 Received shutdown signal, test time was about 3.000000 seconds 00:31:12.363 00:31:12.363 Latency(us) 00:31:12.363 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:12.363 =================================================================================================================== 00:31:12.363 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:12.363 13:57:00 compress_isal -- common/autotest_common.sh@972 -- # wait 607418 00:31:15.650 13:57:03 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:15.650 13:57:03 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:15.650 13:57:03 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=609486 00:31:15.650 13:57:03 compress_isal -- compress/compress.sh@69 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:15.650 13:57:03 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:15.650 13:57:03 compress_isal -- compress/compress.sh@73 -- # waitforlisten 609486 00:31:15.650 13:57:03 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 609486 ']' 00:31:15.650 13:57:03 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:15.650 13:57:03 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:15.650 13:57:03 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:15.650 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:15.650 13:57:03 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:15.650 13:57:03 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:15.650 [2024-07-12 13:57:03.723457] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:31:15.650 [2024-07-12 13:57:03.723542] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid609486 ] 00:31:15.650 [2024-07-12 13:57:03.873917] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:15.650 [2024-07-12 13:57:04.018974] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:15.650 [2024-07-12 13:57:04.018983] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:15.650 13:57:04 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:15.650 13:57:04 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:15.650 13:57:04 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:31:15.650 13:57:04 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:15.650 13:57:04 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:16.587 13:57:05 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:16.587 13:57:05 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:16.587 13:57:05 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:16.587 13:57:05 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:16.587 13:57:05 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:16.587 13:57:05 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:16.587 13:57:05 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:16.847 13:57:05 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:17.106 [ 00:31:17.106 { 00:31:17.106 "name": "Nvme0n1", 00:31:17.106 "aliases": [ 00:31:17.106 "01000000-0000-0000-5cd2-e43197705251" 00:31:17.106 ], 00:31:17.106 "product_name": "NVMe disk", 00:31:17.106 "block_size": 512, 00:31:17.106 "num_blocks": 15002931888, 00:31:17.106 "uuid": 
"01000000-0000-0000-5cd2-e43197705251", 00:31:17.106 "assigned_rate_limits": { 00:31:17.106 "rw_ios_per_sec": 0, 00:31:17.106 "rw_mbytes_per_sec": 0, 00:31:17.106 "r_mbytes_per_sec": 0, 00:31:17.106 "w_mbytes_per_sec": 0 00:31:17.106 }, 00:31:17.106 "claimed": false, 00:31:17.106 "zoned": false, 00:31:17.106 "supported_io_types": { 00:31:17.106 "read": true, 00:31:17.106 "write": true, 00:31:17.106 "unmap": true, 00:31:17.106 "flush": true, 00:31:17.106 "reset": true, 00:31:17.106 "nvme_admin": true, 00:31:17.106 "nvme_io": true, 00:31:17.106 "nvme_io_md": false, 00:31:17.106 "write_zeroes": true, 00:31:17.106 "zcopy": false, 00:31:17.106 "get_zone_info": false, 00:31:17.106 "zone_management": false, 00:31:17.106 "zone_append": false, 00:31:17.106 "compare": false, 00:31:17.106 "compare_and_write": false, 00:31:17.106 "abort": true, 00:31:17.106 "seek_hole": false, 00:31:17.106 "seek_data": false, 00:31:17.106 "copy": false, 00:31:17.106 "nvme_iov_md": false 00:31:17.106 }, 00:31:17.106 "driver_specific": { 00:31:17.106 "nvme": [ 00:31:17.106 { 00:31:17.106 "pci_address": "0000:5e:00.0", 00:31:17.106 "trid": { 00:31:17.106 "trtype": "PCIe", 00:31:17.106 "traddr": "0000:5e:00.0" 00:31:17.106 }, 00:31:17.106 "ctrlr_data": { 00:31:17.106 "cntlid": 0, 00:31:17.106 "vendor_id": "0x8086", 00:31:17.106 "model_number": "INTEL SSDPF2KX076TZO", 00:31:17.106 "serial_number": "PHAC0301002G7P6CGN", 00:31:17.106 "firmware_revision": "JCV10200", 00:31:17.106 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:17.106 "oacs": { 00:31:17.106 "security": 1, 00:31:17.106 "format": 1, 00:31:17.106 "firmware": 1, 00:31:17.106 "ns_manage": 1 00:31:17.106 }, 00:31:17.106 "multi_ctrlr": false, 00:31:17.106 "ana_reporting": false 00:31:17.106 }, 00:31:17.106 "vs": { 00:31:17.106 "nvme_version": "1.3" 00:31:17.106 }, 00:31:17.106 "ns_data": { 00:31:17.106 "id": 1, 00:31:17.106 "can_share": false 00:31:17.106 }, 00:31:17.106 "security": { 00:31:17.106 "opal": true 00:31:17.106 } 00:31:17.106 } 00:31:17.106 ], 00:31:17.106 "mp_policy": "active_passive" 00:31:17.106 } 00:31:17.106 } 00:31:17.106 ] 00:31:17.106 13:57:05 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:17.106 13:57:05 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:19.643 3ae0aa33-80d7-4cf9-819b-665b17ea6607 00:31:19.643 13:57:08 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:19.902 d2f3ab9b-e4dd-42f3-862a-2107acabbf43 00:31:19.902 13:57:08 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:19.902 13:57:08 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:19.902 13:57:08 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:19.902 13:57:08 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:19.902 13:57:08 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:19.902 13:57:08 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:19.902 13:57:08 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:20.161 13:57:08 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:20.161 [ 00:31:20.161 
{ 00:31:20.161 "name": "d2f3ab9b-e4dd-42f3-862a-2107acabbf43", 00:31:20.161 "aliases": [ 00:31:20.161 "lvs0/lv0" 00:31:20.161 ], 00:31:20.161 "product_name": "Logical Volume", 00:31:20.161 "block_size": 512, 00:31:20.161 "num_blocks": 204800, 00:31:20.161 "uuid": "d2f3ab9b-e4dd-42f3-862a-2107acabbf43", 00:31:20.161 "assigned_rate_limits": { 00:31:20.161 "rw_ios_per_sec": 0, 00:31:20.161 "rw_mbytes_per_sec": 0, 00:31:20.161 "r_mbytes_per_sec": 0, 00:31:20.161 "w_mbytes_per_sec": 0 00:31:20.161 }, 00:31:20.161 "claimed": false, 00:31:20.161 "zoned": false, 00:31:20.161 "supported_io_types": { 00:31:20.161 "read": true, 00:31:20.161 "write": true, 00:31:20.161 "unmap": true, 00:31:20.161 "flush": false, 00:31:20.161 "reset": true, 00:31:20.161 "nvme_admin": false, 00:31:20.161 "nvme_io": false, 00:31:20.161 "nvme_io_md": false, 00:31:20.161 "write_zeroes": true, 00:31:20.161 "zcopy": false, 00:31:20.161 "get_zone_info": false, 00:31:20.161 "zone_management": false, 00:31:20.161 "zone_append": false, 00:31:20.161 "compare": false, 00:31:20.161 "compare_and_write": false, 00:31:20.161 "abort": false, 00:31:20.161 "seek_hole": true, 00:31:20.161 "seek_data": true, 00:31:20.161 "copy": false, 00:31:20.161 "nvme_iov_md": false 00:31:20.161 }, 00:31:20.161 "driver_specific": { 00:31:20.161 "lvol": { 00:31:20.161 "lvol_store_uuid": "3ae0aa33-80d7-4cf9-819b-665b17ea6607", 00:31:20.161 "base_bdev": "Nvme0n1", 00:31:20.161 "thin_provision": true, 00:31:20.161 "num_allocated_clusters": 0, 00:31:20.161 "snapshot": false, 00:31:20.161 "clone": false, 00:31:20.161 "esnap_clone": false 00:31:20.161 } 00:31:20.161 } 00:31:20.161 } 00:31:20.161 ] 00:31:20.161 13:57:08 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:20.161 13:57:08 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:31:20.161 13:57:08 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:31:20.420 [2024-07-12 13:57:08.876859] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:20.420 COMP_lvs0/lv0 00:31:20.420 13:57:08 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:20.420 13:57:08 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:20.420 13:57:08 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:20.420 13:57:08 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:20.420 13:57:08 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:20.420 13:57:08 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:20.420 13:57:08 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:20.680 13:57:09 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:20.939 [ 00:31:20.939 { 00:31:20.939 "name": "COMP_lvs0/lv0", 00:31:20.939 "aliases": [ 00:31:20.939 "632dc891-1c31-544f-9c31-a215ee9e0b34" 00:31:20.939 ], 00:31:20.939 "product_name": "compress", 00:31:20.939 "block_size": 512, 00:31:20.939 "num_blocks": 200704, 00:31:20.939 "uuid": "632dc891-1c31-544f-9c31-a215ee9e0b34", 00:31:20.939 "assigned_rate_limits": { 00:31:20.939 "rw_ios_per_sec": 0, 00:31:20.939 "rw_mbytes_per_sec": 0, 00:31:20.939 "r_mbytes_per_sec": 0, 
00:31:20.939 "w_mbytes_per_sec": 0 00:31:20.939 }, 00:31:20.939 "claimed": false, 00:31:20.939 "zoned": false, 00:31:20.939 "supported_io_types": { 00:31:20.939 "read": true, 00:31:20.939 "write": true, 00:31:20.939 "unmap": false, 00:31:20.939 "flush": false, 00:31:20.939 "reset": false, 00:31:20.939 "nvme_admin": false, 00:31:20.939 "nvme_io": false, 00:31:20.939 "nvme_io_md": false, 00:31:20.939 "write_zeroes": true, 00:31:20.939 "zcopy": false, 00:31:20.939 "get_zone_info": false, 00:31:20.939 "zone_management": false, 00:31:20.939 "zone_append": false, 00:31:20.939 "compare": false, 00:31:20.939 "compare_and_write": false, 00:31:20.939 "abort": false, 00:31:20.939 "seek_hole": false, 00:31:20.939 "seek_data": false, 00:31:20.939 "copy": false, 00:31:20.939 "nvme_iov_md": false 00:31:20.939 }, 00:31:20.939 "driver_specific": { 00:31:20.939 "compress": { 00:31:20.939 "name": "COMP_lvs0/lv0", 00:31:20.939 "base_bdev_name": "d2f3ab9b-e4dd-42f3-862a-2107acabbf43" 00:31:20.939 } 00:31:20.939 } 00:31:20.939 } 00:31:20.939 ] 00:31:20.939 13:57:09 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:20.939 13:57:09 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:21.198 Running I/O for 3 seconds... 00:31:24.488 00:31:24.488 Latency(us) 00:31:24.488 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:24.488 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:24.488 Verification LBA range: start 0x0 length 0x3100 00:31:24.488 COMP_lvs0/lv0 : 3.01 1252.01 4.89 0.00 0.00 25445.46 2350.75 28379.94 00:31:24.488 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:24.488 Verification LBA range: start 0x3100 length 0x3100 00:31:24.488 COMP_lvs0/lv0 : 3.01 1254.50 4.90 0.00 0.00 25368.84 1403.33 27924.03 00:31:24.488 =================================================================================================================== 00:31:24.488 Total : 2506.51 9.79 0.00 0.00 25407.11 1403.33 28379.94 00:31:24.488 0 00:31:24.488 13:57:12 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:24.488 13:57:12 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:24.488 13:57:12 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:24.745 13:57:13 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:24.746 13:57:13 compress_isal -- compress/compress.sh@78 -- # killprocess 609486 00:31:24.746 13:57:13 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 609486 ']' 00:31:24.746 13:57:13 compress_isal -- common/autotest_common.sh@952 -- # kill -0 609486 00:31:24.746 13:57:13 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:24.746 13:57:13 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:24.746 13:57:13 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 609486 00:31:24.746 13:57:13 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:24.746 13:57:13 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:24.746 13:57:13 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 609486' 00:31:24.746 killing process with pid 609486 
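The run being shut down here created its compress bdev with -l 512; the next and final isal run repeats the same cycle with -l 4096. The -l argument to bdev_compress_create selects the logical block size the compressed volume exposes, which is why the bdev_get_bdevs dump for the last run (further below) reports "block_size": 4096 and "num_blocks": 25088 instead of the 512/200704 seen above for the same 100 MiB lvol. Only that one step differs between the three isal runs; a sketch using the names traced on this host:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem           # first run: default logical block size (512 in the dumps above)
    $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512    # second run (the one torn down here)
    $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096   # third run: 4096-byte blocks, hence 25088 of them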
00:31:24.746 13:57:13 compress_isal -- common/autotest_common.sh@967 -- # kill 609486 00:31:24.746 Received shutdown signal, test time was about 3.000000 seconds 00:31:24.746 00:31:24.746 Latency(us) 00:31:24.746 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:24.746 =================================================================================================================== 00:31:24.746 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:24.746 13:57:13 compress_isal -- common/autotest_common.sh@972 -- # wait 609486 00:31:28.029 13:57:15 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:31:28.029 13:57:15 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:28.029 13:57:15 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=611407 00:31:28.029 13:57:15 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:28.029 13:57:15 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:28.029 13:57:15 compress_isal -- compress/compress.sh@73 -- # waitforlisten 611407 00:31:28.029 13:57:15 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 611407 ']' 00:31:28.029 13:57:15 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:28.029 13:57:15 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:28.029 13:57:15 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:28.029 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:28.029 13:57:15 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:28.029 13:57:15 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:28.029 [2024-07-12 13:57:16.029673] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:31:28.029 [2024-07-12 13:57:16.029751] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid611407 ] 00:31:28.029 [2024-07-12 13:57:16.163591] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:28.029 [2024-07-12 13:57:16.280086] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:28.029 [2024-07-12 13:57:16.280092] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:28.596 13:57:16 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:28.596 13:57:16 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:28.596 13:57:16 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:31:28.596 13:57:16 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:28.596 13:57:16 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:29.163 13:57:17 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:29.163 13:57:17 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:29.163 13:57:17 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:29.163 13:57:17 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:29.163 13:57:17 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:29.163 13:57:17 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:29.163 13:57:17 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:29.423 13:57:17 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:30.113 [ 00:31:30.114 { 00:31:30.114 "name": "Nvme0n1", 00:31:30.114 "aliases": [ 00:31:30.114 "01000000-0000-0000-5cd2-e43197705251" 00:31:30.114 ], 00:31:30.114 "product_name": "NVMe disk", 00:31:30.114 "block_size": 512, 00:31:30.114 "num_blocks": 15002931888, 00:31:30.114 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:30.114 "assigned_rate_limits": { 00:31:30.114 "rw_ios_per_sec": 0, 00:31:30.114 "rw_mbytes_per_sec": 0, 00:31:30.114 "r_mbytes_per_sec": 0, 00:31:30.114 "w_mbytes_per_sec": 0 00:31:30.114 }, 00:31:30.114 "claimed": false, 00:31:30.114 "zoned": false, 00:31:30.114 "supported_io_types": { 00:31:30.114 "read": true, 00:31:30.114 "write": true, 00:31:30.114 "unmap": true, 00:31:30.114 "flush": true, 00:31:30.114 "reset": true, 00:31:30.114 "nvme_admin": true, 00:31:30.114 "nvme_io": true, 00:31:30.114 "nvme_io_md": false, 00:31:30.114 "write_zeroes": true, 00:31:30.114 "zcopy": false, 00:31:30.114 "get_zone_info": false, 00:31:30.114 "zone_management": false, 00:31:30.114 "zone_append": false, 00:31:30.114 "compare": false, 00:31:30.114 "compare_and_write": false, 00:31:30.114 "abort": true, 00:31:30.114 "seek_hole": false, 00:31:30.114 "seek_data": false, 00:31:30.114 "copy": false, 00:31:30.114 "nvme_iov_md": false 00:31:30.114 }, 00:31:30.114 "driver_specific": { 00:31:30.114 "nvme": [ 00:31:30.114 { 00:31:30.114 "pci_address": "0000:5e:00.0", 00:31:30.114 "trid": { 00:31:30.114 "trtype": "PCIe", 00:31:30.114 "traddr": "0000:5e:00.0" 00:31:30.114 }, 00:31:30.114 
"ctrlr_data": { 00:31:30.114 "cntlid": 0, 00:31:30.114 "vendor_id": "0x8086", 00:31:30.114 "model_number": "INTEL SSDPF2KX076TZO", 00:31:30.114 "serial_number": "PHAC0301002G7P6CGN", 00:31:30.114 "firmware_revision": "JCV10200", 00:31:30.114 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:30.114 "oacs": { 00:31:30.114 "security": 1, 00:31:30.114 "format": 1, 00:31:30.114 "firmware": 1, 00:31:30.114 "ns_manage": 1 00:31:30.114 }, 00:31:30.114 "multi_ctrlr": false, 00:31:30.114 "ana_reporting": false 00:31:30.114 }, 00:31:30.114 "vs": { 00:31:30.114 "nvme_version": "1.3" 00:31:30.114 }, 00:31:30.114 "ns_data": { 00:31:30.114 "id": 1, 00:31:30.114 "can_share": false 00:31:30.114 }, 00:31:30.114 "security": { 00:31:30.114 "opal": true 00:31:30.114 } 00:31:30.114 } 00:31:30.114 ], 00:31:30.114 "mp_policy": "active_passive" 00:31:30.114 } 00:31:30.114 } 00:31:30.114 ] 00:31:30.114 13:57:18 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:30.114 13:57:18 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:32.641 1eb304f1-d297-4b9f-9efc-c30ba1b1e0ef 00:31:32.641 13:57:21 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:32.900 eb4b5ca7-7408-41b3-b3de-059f3d0b223e 00:31:32.900 13:57:21 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:32.900 13:57:21 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:32.900 13:57:21 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:32.900 13:57:21 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:32.900 13:57:21 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:32.900 13:57:21 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:32.900 13:57:21 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:33.159 13:57:21 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:33.726 [ 00:31:33.726 { 00:31:33.726 "name": "eb4b5ca7-7408-41b3-b3de-059f3d0b223e", 00:31:33.726 "aliases": [ 00:31:33.726 "lvs0/lv0" 00:31:33.726 ], 00:31:33.726 "product_name": "Logical Volume", 00:31:33.726 "block_size": 512, 00:31:33.726 "num_blocks": 204800, 00:31:33.726 "uuid": "eb4b5ca7-7408-41b3-b3de-059f3d0b223e", 00:31:33.726 "assigned_rate_limits": { 00:31:33.726 "rw_ios_per_sec": 0, 00:31:33.726 "rw_mbytes_per_sec": 0, 00:31:33.726 "r_mbytes_per_sec": 0, 00:31:33.726 "w_mbytes_per_sec": 0 00:31:33.726 }, 00:31:33.726 "claimed": false, 00:31:33.726 "zoned": false, 00:31:33.726 "supported_io_types": { 00:31:33.726 "read": true, 00:31:33.726 "write": true, 00:31:33.726 "unmap": true, 00:31:33.726 "flush": false, 00:31:33.726 "reset": true, 00:31:33.726 "nvme_admin": false, 00:31:33.726 "nvme_io": false, 00:31:33.726 "nvme_io_md": false, 00:31:33.726 "write_zeroes": true, 00:31:33.726 "zcopy": false, 00:31:33.726 "get_zone_info": false, 00:31:33.726 "zone_management": false, 00:31:33.726 "zone_append": false, 00:31:33.726 "compare": false, 00:31:33.726 "compare_and_write": false, 00:31:33.726 "abort": false, 00:31:33.726 "seek_hole": true, 00:31:33.726 "seek_data": true, 00:31:33.726 "copy": false, 00:31:33.726 
"nvme_iov_md": false 00:31:33.726 }, 00:31:33.726 "driver_specific": { 00:31:33.726 "lvol": { 00:31:33.726 "lvol_store_uuid": "1eb304f1-d297-4b9f-9efc-c30ba1b1e0ef", 00:31:33.726 "base_bdev": "Nvme0n1", 00:31:33.726 "thin_provision": true, 00:31:33.726 "num_allocated_clusters": 0, 00:31:33.726 "snapshot": false, 00:31:33.726 "clone": false, 00:31:33.726 "esnap_clone": false 00:31:33.726 } 00:31:33.726 } 00:31:33.726 } 00:31:33.726 ] 00:31:33.726 13:57:22 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:33.726 13:57:22 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:31:33.726 13:57:22 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:31:33.986 [2024-07-12 13:57:22.355972] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:33.986 COMP_lvs0/lv0 00:31:33.986 13:57:22 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:33.986 13:57:22 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:33.986 13:57:22 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:33.986 13:57:22 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:33.986 13:57:22 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:33.986 13:57:22 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:33.986 13:57:22 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:34.245 13:57:22 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:34.813 [ 00:31:34.813 { 00:31:34.813 "name": "COMP_lvs0/lv0", 00:31:34.813 "aliases": [ 00:31:34.813 "e762b554-4585-556f-bb43-18c87ff487d1" 00:31:34.813 ], 00:31:34.813 "product_name": "compress", 00:31:34.813 "block_size": 4096, 00:31:34.813 "num_blocks": 25088, 00:31:34.813 "uuid": "e762b554-4585-556f-bb43-18c87ff487d1", 00:31:34.813 "assigned_rate_limits": { 00:31:34.813 "rw_ios_per_sec": 0, 00:31:34.813 "rw_mbytes_per_sec": 0, 00:31:34.813 "r_mbytes_per_sec": 0, 00:31:34.813 "w_mbytes_per_sec": 0 00:31:34.813 }, 00:31:34.813 "claimed": false, 00:31:34.813 "zoned": false, 00:31:34.813 "supported_io_types": { 00:31:34.813 "read": true, 00:31:34.813 "write": true, 00:31:34.813 "unmap": false, 00:31:34.813 "flush": false, 00:31:34.813 "reset": false, 00:31:34.813 "nvme_admin": false, 00:31:34.813 "nvme_io": false, 00:31:34.813 "nvme_io_md": false, 00:31:34.813 "write_zeroes": true, 00:31:34.813 "zcopy": false, 00:31:34.813 "get_zone_info": false, 00:31:34.813 "zone_management": false, 00:31:34.813 "zone_append": false, 00:31:34.813 "compare": false, 00:31:34.813 "compare_and_write": false, 00:31:34.813 "abort": false, 00:31:34.813 "seek_hole": false, 00:31:34.813 "seek_data": false, 00:31:34.813 "copy": false, 00:31:34.813 "nvme_iov_md": false 00:31:34.813 }, 00:31:34.813 "driver_specific": { 00:31:34.813 "compress": { 00:31:34.813 "name": "COMP_lvs0/lv0", 00:31:34.813 "base_bdev_name": "eb4b5ca7-7408-41b3-b3de-059f3d0b223e" 00:31:34.813 } 00:31:34.813 } 00:31:34.813 } 00:31:34.813 ] 00:31:34.813 13:57:23 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:34.813 13:57:23 compress_isal -- compress/compress.sh@75 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:34.813 Running I/O for 3 seconds... 00:31:38.101 00:31:38.101 Latency(us) 00:31:38.101 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:38.101 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:38.101 Verification LBA range: start 0x0 length 0x3100 00:31:38.101 COMP_lvs0/lv0 : 3.01 1281.94 5.01 0.00 0.00 24853.96 2080.06 21541.40 00:31:38.101 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:38.101 Verification LBA range: start 0x3100 length 0x3100 00:31:38.101 COMP_lvs0/lv0 : 3.01 1283.27 5.01 0.00 0.00 24795.82 1510.18 19831.76 00:31:38.101 =================================================================================================================== 00:31:38.101 Total : 2565.21 10.02 0.00 0.00 24824.87 1510.18 21541.40 00:31:38.101 0 00:31:38.101 13:57:26 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:38.101 13:57:26 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:38.361 13:57:26 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:38.620 13:57:27 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:38.620 13:57:27 compress_isal -- compress/compress.sh@78 -- # killprocess 611407 00:31:38.620 13:57:27 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 611407 ']' 00:31:38.620 13:57:27 compress_isal -- common/autotest_common.sh@952 -- # kill -0 611407 00:31:38.620 13:57:27 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:38.620 13:57:27 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:38.620 13:57:27 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 611407 00:31:38.620 13:57:27 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:38.620 13:57:27 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:38.620 13:57:27 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 611407' 00:31:38.620 killing process with pid 611407 00:31:38.620 13:57:27 compress_isal -- common/autotest_common.sh@967 -- # kill 611407 00:31:38.620 Received shutdown signal, test time was about 3.000000 seconds 00:31:38.620 00:31:38.620 Latency(us) 00:31:38.620 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:38.620 =================================================================================================================== 00:31:38.620 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:38.620 13:57:27 compress_isal -- common/autotest_common.sh@972 -- # wait 611407 00:31:41.909 13:57:30 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:31:41.909 13:57:30 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:41.909 13:57:30 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=613246 00:31:41.910 13:57:30 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:41.910 13:57:30 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:31:41.910 13:57:30 compress_isal -- compress/compress.sh@57 -- # waitforlisten 613246 
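Stripped of the xtrace prefixes and timestamps, the 4 KiB-block compress pass that completed above comes down to the RPC sequence below. Every command appears verbatim in this trace; only the shell plumbing is omitted and the absolute workspace paths are shortened to their repo-relative form. The sizes in the comments are taken from the bdev dumps earlier in the log.

    scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100                     # thin-provisioned 100 MiB lvol
    scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096   # 4 KiB logical blocks
    examples/bdev/bdevperf/bdevperf.py perform_tests                       # run the queued verify workload
    scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0                      # destroy_vols teardown
    scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0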
00:31:41.910 13:57:30 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 613246 ']' 00:31:41.910 13:57:30 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:41.910 13:57:30 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:41.910 13:57:30 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:41.910 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:41.910 13:57:30 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:41.910 13:57:30 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:41.910 [2024-07-12 13:57:30.276022] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:31:41.910 [2024-07-12 13:57:30.276160] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid613246 ] 00:31:41.910 [2024-07-12 13:57:30.470398] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:42.169 [2024-07-12 13:57:30.570833] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:42.169 [2024-07-12 13:57:30.570938] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:42.169 [2024-07-12 13:57:30.570948] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:42.736 13:57:31 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:42.736 13:57:31 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:42.736 13:57:31 compress_isal -- compress/compress.sh@58 -- # create_vols 00:31:42.736 13:57:31 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:42.736 13:57:31 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:43.305 13:57:31 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:43.305 13:57:31 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:43.305 13:57:31 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:43.305 13:57:31 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:43.305 13:57:31 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:43.305 13:57:31 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:43.305 13:57:31 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:43.564 13:57:31 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:43.823 [ 00:31:43.823 { 00:31:43.823 "name": "Nvme0n1", 00:31:43.823 "aliases": [ 00:31:43.823 "01000000-0000-0000-5cd2-e43197705251" 00:31:43.823 ], 00:31:43.823 "product_name": "NVMe disk", 00:31:43.823 "block_size": 512, 00:31:43.823 "num_blocks": 15002931888, 00:31:43.823 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:43.823 "assigned_rate_limits": { 00:31:43.823 "rw_ios_per_sec": 0, 00:31:43.823 "rw_mbytes_per_sec": 0, 00:31:43.823 "r_mbytes_per_sec": 0, 00:31:43.823 "w_mbytes_per_sec": 0 00:31:43.823 }, 00:31:43.823 "claimed": false, 00:31:43.823 
"zoned": false, 00:31:43.823 "supported_io_types": { 00:31:43.823 "read": true, 00:31:43.823 "write": true, 00:31:43.823 "unmap": true, 00:31:43.823 "flush": true, 00:31:43.823 "reset": true, 00:31:43.823 "nvme_admin": true, 00:31:43.823 "nvme_io": true, 00:31:43.823 "nvme_io_md": false, 00:31:43.823 "write_zeroes": true, 00:31:43.823 "zcopy": false, 00:31:43.823 "get_zone_info": false, 00:31:43.823 "zone_management": false, 00:31:43.823 "zone_append": false, 00:31:43.823 "compare": false, 00:31:43.823 "compare_and_write": false, 00:31:43.823 "abort": true, 00:31:43.823 "seek_hole": false, 00:31:43.823 "seek_data": false, 00:31:43.823 "copy": false, 00:31:43.823 "nvme_iov_md": false 00:31:43.823 }, 00:31:43.823 "driver_specific": { 00:31:43.823 "nvme": [ 00:31:43.823 { 00:31:43.823 "pci_address": "0000:5e:00.0", 00:31:43.823 "trid": { 00:31:43.823 "trtype": "PCIe", 00:31:43.823 "traddr": "0000:5e:00.0" 00:31:43.823 }, 00:31:43.823 "ctrlr_data": { 00:31:43.823 "cntlid": 0, 00:31:43.823 "vendor_id": "0x8086", 00:31:43.823 "model_number": "INTEL SSDPF2KX076TZO", 00:31:43.823 "serial_number": "PHAC0301002G7P6CGN", 00:31:43.823 "firmware_revision": "JCV10200", 00:31:43.823 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:43.823 "oacs": { 00:31:43.823 "security": 1, 00:31:43.823 "format": 1, 00:31:43.823 "firmware": 1, 00:31:43.823 "ns_manage": 1 00:31:43.823 }, 00:31:43.823 "multi_ctrlr": false, 00:31:43.823 "ana_reporting": false 00:31:43.823 }, 00:31:43.823 "vs": { 00:31:43.823 "nvme_version": "1.3" 00:31:43.823 }, 00:31:43.823 "ns_data": { 00:31:43.823 "id": 1, 00:31:43.823 "can_share": false 00:31:43.823 }, 00:31:43.823 "security": { 00:31:43.823 "opal": true 00:31:43.823 } 00:31:43.823 } 00:31:43.823 ], 00:31:43.823 "mp_policy": "active_passive" 00:31:43.823 } 00:31:43.823 } 00:31:43.823 ] 00:31:43.823 13:57:32 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:43.823 13:57:32 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:46.357 a7e0d381-9fdd-4af2-a479-d6192f3596b4 00:31:46.357 13:57:34 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:46.616 58c1dbfe-b254-403d-b903-35556a074ba7 00:31:46.616 13:57:34 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:46.616 13:57:34 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:46.616 13:57:34 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:46.616 13:57:34 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:46.616 13:57:34 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:46.616 13:57:34 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:46.616 13:57:34 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:46.875 13:57:35 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:47.134 [ 00:31:47.134 { 00:31:47.134 "name": "58c1dbfe-b254-403d-b903-35556a074ba7", 00:31:47.134 "aliases": [ 00:31:47.134 "lvs0/lv0" 00:31:47.134 ], 00:31:47.134 "product_name": "Logical Volume", 00:31:47.134 "block_size": 512, 00:31:47.134 "num_blocks": 204800, 00:31:47.134 "uuid": 
"58c1dbfe-b254-403d-b903-35556a074ba7", 00:31:47.134 "assigned_rate_limits": { 00:31:47.134 "rw_ios_per_sec": 0, 00:31:47.134 "rw_mbytes_per_sec": 0, 00:31:47.134 "r_mbytes_per_sec": 0, 00:31:47.134 "w_mbytes_per_sec": 0 00:31:47.134 }, 00:31:47.134 "claimed": false, 00:31:47.134 "zoned": false, 00:31:47.134 "supported_io_types": { 00:31:47.134 "read": true, 00:31:47.134 "write": true, 00:31:47.134 "unmap": true, 00:31:47.134 "flush": false, 00:31:47.134 "reset": true, 00:31:47.134 "nvme_admin": false, 00:31:47.134 "nvme_io": false, 00:31:47.134 "nvme_io_md": false, 00:31:47.134 "write_zeroes": true, 00:31:47.134 "zcopy": false, 00:31:47.134 "get_zone_info": false, 00:31:47.134 "zone_management": false, 00:31:47.134 "zone_append": false, 00:31:47.134 "compare": false, 00:31:47.134 "compare_and_write": false, 00:31:47.134 "abort": false, 00:31:47.134 "seek_hole": true, 00:31:47.134 "seek_data": true, 00:31:47.134 "copy": false, 00:31:47.134 "nvme_iov_md": false 00:31:47.134 }, 00:31:47.134 "driver_specific": { 00:31:47.134 "lvol": { 00:31:47.134 "lvol_store_uuid": "a7e0d381-9fdd-4af2-a479-d6192f3596b4", 00:31:47.134 "base_bdev": "Nvme0n1", 00:31:47.134 "thin_provision": true, 00:31:47.134 "num_allocated_clusters": 0, 00:31:47.134 "snapshot": false, 00:31:47.134 "clone": false, 00:31:47.134 "esnap_clone": false 00:31:47.134 } 00:31:47.134 } 00:31:47.134 } 00:31:47.134 ] 00:31:47.134 13:57:35 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:47.134 13:57:35 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:47.134 13:57:35 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:47.393 [2024-07-12 13:57:35.726334] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:47.393 COMP_lvs0/lv0 00:31:47.393 13:57:35 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:47.393 13:57:35 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:47.393 13:57:35 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:47.393 13:57:35 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:47.393 13:57:35 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:47.393 13:57:35 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:47.393 13:57:35 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:47.652 13:57:36 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:47.911 [ 00:31:47.911 { 00:31:47.911 "name": "COMP_lvs0/lv0", 00:31:47.911 "aliases": [ 00:31:47.911 "1b194562-8f31-5aea-bd79-548eb7419240" 00:31:47.911 ], 00:31:47.911 "product_name": "compress", 00:31:47.911 "block_size": 512, 00:31:47.911 "num_blocks": 200704, 00:31:47.911 "uuid": "1b194562-8f31-5aea-bd79-548eb7419240", 00:31:47.911 "assigned_rate_limits": { 00:31:47.911 "rw_ios_per_sec": 0, 00:31:47.911 "rw_mbytes_per_sec": 0, 00:31:47.911 "r_mbytes_per_sec": 0, 00:31:47.911 "w_mbytes_per_sec": 0 00:31:47.911 }, 00:31:47.911 "claimed": false, 00:31:47.911 "zoned": false, 00:31:47.911 "supported_io_types": { 00:31:47.911 "read": true, 00:31:47.911 "write": true, 00:31:47.911 "unmap": false, 00:31:47.911 "flush": false, 00:31:47.911 
"reset": false, 00:31:47.911 "nvme_admin": false, 00:31:47.911 "nvme_io": false, 00:31:47.911 "nvme_io_md": false, 00:31:47.911 "write_zeroes": true, 00:31:47.911 "zcopy": false, 00:31:47.911 "get_zone_info": false, 00:31:47.911 "zone_management": false, 00:31:47.911 "zone_append": false, 00:31:47.911 "compare": false, 00:31:47.911 "compare_and_write": false, 00:31:47.911 "abort": false, 00:31:47.911 "seek_hole": false, 00:31:47.911 "seek_data": false, 00:31:47.911 "copy": false, 00:31:47.911 "nvme_iov_md": false 00:31:47.911 }, 00:31:47.911 "driver_specific": { 00:31:47.911 "compress": { 00:31:47.911 "name": "COMP_lvs0/lv0", 00:31:47.911 "base_bdev_name": "58c1dbfe-b254-403d-b903-35556a074ba7" 00:31:47.911 } 00:31:47.911 } 00:31:47.911 } 00:31:47.911 ] 00:31:47.911 13:57:36 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:47.911 13:57:36 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:47.911 I/O targets: 00:31:47.911 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:47.911 00:31:47.911 00:31:47.911 CUnit - A unit testing framework for C - Version 2.1-3 00:31:47.911 http://cunit.sourceforge.net/ 00:31:47.911 00:31:47.911 00:31:47.911 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:47.911 Test: blockdev write read block ...passed 00:31:47.911 Test: blockdev write zeroes read block ...passed 00:31:47.911 Test: blockdev write zeroes read no split ...passed 00:31:47.911 Test: blockdev write zeroes read split ...passed 00:31:48.170 Test: blockdev write zeroes read split partial ...passed 00:31:48.170 Test: blockdev reset ...[2024-07-12 13:57:36.522506] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:48.170 passed 00:31:48.170 Test: blockdev write read 8 blocks ...passed 00:31:48.170 Test: blockdev write read size > 128k ...passed 00:31:48.170 Test: blockdev write read invalid size ...passed 00:31:48.170 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:48.170 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:48.170 Test: blockdev write read max offset ...passed 00:31:48.170 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:48.170 Test: blockdev writev readv 8 blocks ...passed 00:31:48.170 Test: blockdev writev readv 30 x 1block ...passed 00:31:48.170 Test: blockdev writev readv block ...passed 00:31:48.170 Test: blockdev writev readv size > 128k ...passed 00:31:48.170 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:48.170 Test: blockdev comparev and writev ...passed 00:31:48.170 Test: blockdev nvme passthru rw ...passed 00:31:48.170 Test: blockdev nvme passthru vendor specific ...passed 00:31:48.170 Test: blockdev nvme admin passthru ...passed 00:31:48.170 Test: blockdev copy ...passed 00:31:48.170 00:31:48.170 Run Summary: Type Total Ran Passed Failed Inactive 00:31:48.170 suites 1 1 n/a 0 0 00:31:48.170 tests 23 23 23 0 0 00:31:48.170 asserts 130 130 130 0 n/a 00:31:48.170 00:31:48.170 Elapsed time = 0.291 seconds 00:31:48.170 0 00:31:48.170 13:57:36 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:31:48.170 13:57:36 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:48.429 13:57:36 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:31:48.688 13:57:37 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:48.688 13:57:37 compress_isal -- compress/compress.sh@62 -- # killprocess 613246 00:31:48.688 13:57:37 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 613246 ']' 00:31:48.688 13:57:37 compress_isal -- common/autotest_common.sh@952 -- # kill -0 613246 00:31:48.688 13:57:37 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:48.688 13:57:37 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:48.688 13:57:37 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 613246 00:31:48.688 13:57:37 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:48.688 13:57:37 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:48.688 13:57:37 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 613246' 00:31:48.688 killing process with pid 613246 00:31:48.688 13:57:37 compress_isal -- common/autotest_common.sh@967 -- # kill 613246 00:31:48.688 13:57:37 compress_isal -- common/autotest_common.sh@972 -- # wait 613246 00:31:51.976 13:57:39 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:51.976 13:57:39 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:51.976 00:31:51.976 real 0m50.922s 00:31:51.976 user 2m0.565s 00:31:51.976 sys 0m4.964s 00:31:51.976 13:57:39 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:51.976 13:57:39 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:51.976 ************************************ 00:31:51.976 END TEST compress_isal 00:31:51.976 ************************************ 00:31:51.976 13:57:39 -- common/autotest_common.sh@1142 -- # return 0 00:31:51.976 13:57:39 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:31:51.976 13:57:39 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:31:51.976 13:57:39 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:31:51.976 13:57:39 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:51.976 13:57:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:51.976 13:57:39 -- common/autotest_common.sh@10 -- # set +x 00:31:51.976 ************************************ 00:31:51.976 START TEST blockdev_crypto_aesni 00:31:51.976 ************************************ 00:31:51.976 13:57:39 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:31:51.976 * Looking for test storage... 
00:31:51.976 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=614543 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:31:51.976 13:57:40 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 614543 00:31:51.976 13:57:40 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 614543 ']' 00:31:51.976 13:57:40 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:51.976 13:57:40 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:51.976 13:57:40 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk.sock...' 00:31:51.976 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:51.976 13:57:40 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:51.976 13:57:40 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:51.976 [2024-07-12 13:57:40.170126] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:31:51.976 [2024-07-12 13:57:40.170182] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid614543 ] 00:31:51.976 [2024-07-12 13:57:40.294259] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:51.976 [2024-07-12 13:57:40.435620] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:52.913 13:57:41 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:52.913 13:57:41 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:31:52.913 13:57:41 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:31:52.913 13:57:41 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:31:52.913 13:57:41 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:31:52.913 13:57:41 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:52.913 13:57:41 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:52.913 [2024-07-12 13:57:41.154037] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:52.913 [2024-07-12 13:57:41.162071] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:52.913 [2024-07-12 13:57:41.170086] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:52.913 [2024-07-12 13:57:41.235940] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:55.449 true 00:31:55.449 true 00:31:55.449 true 00:31:55.449 true 00:31:55.449 Malloc0 00:31:55.449 Malloc1 00:31:55.449 Malloc2 00:31:55.449 Malloc3 00:31:55.449 [2024-07-12 13:57:43.660128] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:55.449 crypto_ram 00:31:55.449 [2024-07-12 13:57:43.668140] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:55.449 crypto_ram2 00:31:55.449 [2024-07-12 13:57:43.676161] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:55.449 crypto_ram3 00:31:55.449 [2024-07-12 13:57:43.684184] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:55.449 crypto_ram4 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:31:55.449 13:57:43 blockdev_crypto_aesni -- 
bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "974c9983-2676-5357-988b-0dd062ba5aa4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "974c9983-2676-5357-988b-0dd062ba5aa4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "8ba0ccfd-265b-5509-82de-57cd466a3c83"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8ba0ccfd-265b-5509-82de-57cd466a3c83",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "3419be7c-1d57-5966-ae9b-e7071a1b5244"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3419be7c-1d57-5966-ae9b-e7071a1b5244",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "d3fa17a2-9c6b-5e58-b87c-5a18629f6f62"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d3fa17a2-9c6b-5e58-b87c-5a18629f6f62",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:55.449 13:57:43 
blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:31:55.449 13:57:43 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 614543 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 614543 ']' 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 614543 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 614543 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:55.449 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:55.450 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 614543' 00:31:55.450 killing process with pid 614543 00:31:55.450 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 614543 00:31:55.450 13:57:43 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 614543 00:31:56.017 13:57:44 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:56.017 13:57:44 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:56.017 13:57:44 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:31:56.017 13:57:44 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:56.017 13:57:44 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:56.276 ************************************ 00:31:56.276 START TEST bdev_hello_world 00:31:56.276 ************************************ 00:31:56.276 13:57:44 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:56.276 [2024-07-12 13:57:44.679108] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
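The bdev_hello_world case launched above drives the stock hello_bdev example against the first crypto bdev created by setup_crypto_aesni_conf. Reduced to its essentials (workspace prefix shortened; the trailing empty argument in the trace is the unused env_ctx placeholder set at blockdev.sh@685):

    # Load the crypto_aesni bdev configuration and run the write-then-read
    # "Hello World!" round trip against the crypto_ram bdev.
    build/examples/hello_bdev --json test/bdev/bdev.json -b crypto_ram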
00:31:56.276 [2024-07-12 13:57:44.679160] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid615093 ] 00:31:56.276 [2024-07-12 13:57:44.792934] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:56.535 [2024-07-12 13:57:44.899495] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:56.535 [2024-07-12 13:57:44.920784] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:56.535 [2024-07-12 13:57:44.928810] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:56.535 [2024-07-12 13:57:44.936836] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:56.535 [2024-07-12 13:57:45.050082] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:59.069 [2024-07-12 13:57:47.276291] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:59.069 [2024-07-12 13:57:47.276376] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:59.069 [2024-07-12 13:57:47.276392] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:59.069 [2024-07-12 13:57:47.284309] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:59.069 [2024-07-12 13:57:47.284329] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:59.069 [2024-07-12 13:57:47.284340] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:59.069 [2024-07-12 13:57:47.292330] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:59.069 [2024-07-12 13:57:47.292348] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:59.069 [2024-07-12 13:57:47.292360] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:59.069 [2024-07-12 13:57:47.300350] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:59.069 [2024-07-12 13:57:47.300367] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:59.069 [2024-07-12 13:57:47.300379] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:59.069 [2024-07-12 13:57:47.373123] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:31:59.069 [2024-07-12 13:57:47.373168] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:31:59.069 [2024-07-12 13:57:47.373187] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:31:59.069 [2024-07-12 13:57:47.374454] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:31:59.069 [2024-07-12 13:57:47.374530] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:31:59.069 [2024-07-12 13:57:47.374548] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:31:59.069 [2024-07-12 13:57:47.374594] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
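Note the four "vbdev creation deferred pending base bdev arrival" messages above: each crypto_ram* vbdev registers only after its Malloc base bdev has been created and examined, which is why the helpers in this log pair an examine barrier with a name lookup before touching a bdev. The same pattern, using only RPCs that already appear in this trace:

    # Wait until bdev modules have finished examining newly created bdevs, then
    # poll for the vbdev by name with the 2000 ms timeout the helpers pass.
    scripts/rpc.py bdev_wait_for_examine
    scripts/rpc.py bdev_get_bdevs -b crypto_ram -t 2000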
00:31:59.069 00:31:59.069 [2024-07-12 13:57:47.374613] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:31:59.328 00:31:59.328 real 0m3.123s 00:31:59.328 user 0m2.715s 00:31:59.328 sys 0m0.370s 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:31:59.328 ************************************ 00:31:59.328 END TEST bdev_hello_world 00:31:59.328 ************************************ 00:31:59.328 13:57:47 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:59.328 13:57:47 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:31:59.328 13:57:47 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:59.328 13:57:47 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:59.328 13:57:47 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:59.328 ************************************ 00:31:59.328 START TEST bdev_bounds 00:31:59.328 ************************************ 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=615548 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 615548' 00:31:59.328 Process bdevio pid: 615548 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 615548 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 615548 ']' 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:59.328 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:59.328 13:57:47 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:59.328 [2024-07-12 13:57:47.893239] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
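bdev_bounds, which starts here, reruns the generic bdevio suites against all four crypto bdevs described in bdev.json. Condensed from the trace (workspace prefix shortened; -s carries the PRE_RESERVED_MEM=0 value set at blockdev.sh@676, and -w, as in the compress_isal run earlier, holds the suites until tests.py perform_tests is called):

    test/bdev/bdevio/bdevio -w -s 0 --json test/bdev/bdev.json &
    bdevio_pid=$!
    # Results are the per-bdev "Suite: bdevio tests on: crypto_ram*" blocks below.
    test/bdev/bdevio/tests.py perform_tests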
00:31:59.328 [2024-07-12 13:57:47.893309] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid615548 ] 00:31:59.587 [2024-07-12 13:57:48.025498] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:59.587 [2024-07-12 13:57:48.129512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:59.587 [2024-07-12 13:57:48.129611] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:59.587 [2024-07-12 13:57:48.129614] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:59.587 [2024-07-12 13:57:48.151000] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:59.587 [2024-07-12 13:57:48.159021] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:59.587 [2024-07-12 13:57:48.167047] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:59.845 [2024-07-12 13:57:48.275652] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:02.385 [2024-07-12 13:57:50.510899] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:02.385 [2024-07-12 13:57:50.510999] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:02.385 [2024-07-12 13:57:50.511016] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:02.385 [2024-07-12 13:57:50.518910] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:02.385 [2024-07-12 13:57:50.518935] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:02.385 [2024-07-12 13:57:50.518948] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:02.385 [2024-07-12 13:57:50.526941] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:02.385 [2024-07-12 13:57:50.526959] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:02.385 [2024-07-12 13:57:50.526970] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:02.385 [2024-07-12 13:57:50.534964] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:02.385 [2024-07-12 13:57:50.534982] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:02.385 [2024-07-12 13:57:50.534993] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:02.385 13:57:50 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:02.385 13:57:50 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:32:02.385 13:57:50 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:02.385 I/O targets: 00:32:02.385 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:32:02.385 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:32:02.385 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:32:02.385 crypto_ram4: 8192 blocks of 4096 bytes 
(32 MiB) 00:32:02.385 00:32:02.385 00:32:02.385 CUnit - A unit testing framework for C - Version 2.1-3 00:32:02.385 http://cunit.sourceforge.net/ 00:32:02.385 00:32:02.385 00:32:02.385 Suite: bdevio tests on: crypto_ram4 00:32:02.385 Test: blockdev write read block ...passed 00:32:02.385 Test: blockdev write zeroes read block ...passed 00:32:02.385 Test: blockdev write zeroes read no split ...passed 00:32:02.385 Test: blockdev write zeroes read split ...passed 00:32:02.385 Test: blockdev write zeroes read split partial ...passed 00:32:02.385 Test: blockdev reset ...passed 00:32:02.385 Test: blockdev write read 8 blocks ...passed 00:32:02.385 Test: blockdev write read size > 128k ...passed 00:32:02.385 Test: blockdev write read invalid size ...passed 00:32:02.385 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:02.385 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:02.385 Test: blockdev write read max offset ...passed 00:32:02.385 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:02.385 Test: blockdev writev readv 8 blocks ...passed 00:32:02.385 Test: blockdev writev readv 30 x 1block ...passed 00:32:02.385 Test: blockdev writev readv block ...passed 00:32:02.386 Test: blockdev writev readv size > 128k ...passed 00:32:02.386 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:02.386 Test: blockdev comparev and writev ...passed 00:32:02.386 Test: blockdev nvme passthru rw ...passed 00:32:02.386 Test: blockdev nvme passthru vendor specific ...passed 00:32:02.386 Test: blockdev nvme admin passthru ...passed 00:32:02.386 Test: blockdev copy ...passed 00:32:02.386 Suite: bdevio tests on: crypto_ram3 00:32:02.386 Test: blockdev write read block ...passed 00:32:02.386 Test: blockdev write zeroes read block ...passed 00:32:02.386 Test: blockdev write zeroes read no split ...passed 00:32:02.386 Test: blockdev write zeroes read split ...passed 00:32:02.386 Test: blockdev write zeroes read split partial ...passed 00:32:02.386 Test: blockdev reset ...passed 00:32:02.386 Test: blockdev write read 8 blocks ...passed 00:32:02.386 Test: blockdev write read size > 128k ...passed 00:32:02.386 Test: blockdev write read invalid size ...passed 00:32:02.386 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:02.386 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:02.386 Test: blockdev write read max offset ...passed 00:32:02.386 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:02.386 Test: blockdev writev readv 8 blocks ...passed 00:32:02.386 Test: blockdev writev readv 30 x 1block ...passed 00:32:02.386 Test: blockdev writev readv block ...passed 00:32:02.386 Test: blockdev writev readv size > 128k ...passed 00:32:02.386 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:02.386 Test: blockdev comparev and writev ...passed 00:32:02.386 Test: blockdev nvme passthru rw ...passed 00:32:02.386 Test: blockdev nvme passthru vendor specific ...passed 00:32:02.386 Test: blockdev nvme admin passthru ...passed 00:32:02.386 Test: blockdev copy ...passed 00:32:02.386 Suite: bdevio tests on: crypto_ram2 00:32:02.386 Test: blockdev write read block ...passed 00:32:02.386 Test: blockdev write zeroes read block ...passed 00:32:02.386 Test: blockdev write zeroes read no split ...passed 00:32:02.644 Test: blockdev write zeroes read split ...passed 00:32:02.644 Test: blockdev write zeroes read split partial ...passed 
00:32:02.644 Test: blockdev reset ...passed 00:32:02.644 Test: blockdev write read 8 blocks ...passed 00:32:02.901 Test: blockdev write read size > 128k ...passed 00:32:02.901 Test: blockdev write read invalid size ...passed 00:32:02.901 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:02.901 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:02.901 Test: blockdev write read max offset ...passed 00:32:02.901 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:02.901 Test: blockdev writev readv 8 blocks ...passed 00:32:02.901 Test: blockdev writev readv 30 x 1block ...passed 00:32:02.901 Test: blockdev writev readv block ...passed 00:32:02.901 Test: blockdev writev readv size > 128k ...passed 00:32:02.901 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:02.901 Test: blockdev comparev and writev ...passed 00:32:02.901 Test: blockdev nvme passthru rw ...passed 00:32:02.901 Test: blockdev nvme passthru vendor specific ...passed 00:32:02.901 Test: blockdev nvme admin passthru ...passed 00:32:02.901 Test: blockdev copy ...passed 00:32:02.901 Suite: bdevio tests on: crypto_ram 00:32:02.901 Test: blockdev write read block ...passed 00:32:02.901 Test: blockdev write zeroes read block ...passed 00:32:02.901 Test: blockdev write zeroes read no split ...passed 00:32:02.901 Test: blockdev write zeroes read split ...passed 00:32:03.187 Test: blockdev write zeroes read split partial ...passed 00:32:03.187 Test: blockdev reset ...passed 00:32:03.187 Test: blockdev write read 8 blocks ...passed 00:32:03.187 Test: blockdev write read size > 128k ...passed 00:32:03.187 Test: blockdev write read invalid size ...passed 00:32:03.187 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:03.187 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:03.187 Test: blockdev write read max offset ...passed 00:32:03.187 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:03.187 Test: blockdev writev readv 8 blocks ...passed 00:32:03.187 Test: blockdev writev readv 30 x 1block ...passed 00:32:03.187 Test: blockdev writev readv block ...passed 00:32:03.187 Test: blockdev writev readv size > 128k ...passed 00:32:03.187 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:03.187 Test: blockdev comparev and writev ...passed 00:32:03.188 Test: blockdev nvme passthru rw ...passed 00:32:03.188 Test: blockdev nvme passthru vendor specific ...passed 00:32:03.188 Test: blockdev nvme admin passthru ...passed 00:32:03.188 Test: blockdev copy ...passed 00:32:03.188 00:32:03.188 Run Summary: Type Total Ran Passed Failed Inactive 00:32:03.188 suites 4 4 n/a 0 0 00:32:03.188 tests 92 92 92 0 0 00:32:03.188 asserts 520 520 520 0 n/a 00:32:03.188 00:32:03.188 Elapsed time = 1.641 seconds 00:32:03.188 0 00:32:03.188 13:57:51 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 615548 00:32:03.188 13:57:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 615548 ']' 00:32:03.188 13:57:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 615548 00:32:03.188 13:57:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:32:03.188 13:57:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:03.188 13:57:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
615548 00:32:03.188 13:57:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:03.188 13:57:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:03.188 13:57:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 615548' 00:32:03.188 killing process with pid 615548 00:32:03.188 13:57:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 615548 00:32:03.188 13:57:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 615548 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:03.754 00:32:03.754 real 0m4.211s 00:32:03.754 user 0m11.169s 00:32:03.754 sys 0m0.575s 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:03.754 ************************************ 00:32:03.754 END TEST bdev_bounds 00:32:03.754 ************************************ 00:32:03.754 13:57:52 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:03.754 13:57:52 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:03.754 13:57:52 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:03.754 13:57:52 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:03.754 13:57:52 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:03.754 ************************************ 00:32:03.754 START TEST bdev_nbd 00:32:03.754 ************************************ 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # 
bdev_num=4 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=616156 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 616156 /var/tmp/spdk-nbd.sock 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 616156 ']' 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:32:03.754 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:03.754 13:57:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:03.754 [2024-07-12 13:57:52.198434] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
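The trace above shows the bdev_nbd harness launching the bdev_svc app with the crypto bdev JSON config on a dedicated RPC socket before any NBD devices are created. A rough, hypothetical reconstruction of that launch step is sketched below; the socket-polling loop and the use of rpc_get_methods as a readiness probe are assumptions, not the harness's actual waitforlisten helper.

```bash
# Hypothetical reconstruction of the bdev_svc launch traced above.
# The readiness poll is an assumption; the real harness uses waitforlisten.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
SOCK=/var/tmp/spdk-nbd.sock

# Start the bdev service with the crypto bdev configuration and a private RPC socket.
"$SPDK/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 --json "$SPDK/test/bdev/bdev.json" &
nbd_pid=$!

# Poll until the app answers on its RPC socket (rpc_get_methods is a harmless probe).
until "$SPDK/scripts/rpc.py" -s "$SOCK" rpc_get_methods > /dev/null 2>&1; do
    sleep 0.5
done
```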
00:32:03.754 [2024-07-12 13:57:52.198504] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:03.754 [2024-07-12 13:57:52.328637] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:04.012 [2024-07-12 13:57:52.431308] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:04.012 [2024-07-12 13:57:52.452648] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:04.012 [2024-07-12 13:57:52.460670] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:04.012 [2024-07-12 13:57:52.468690] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:04.012 [2024-07-12 13:57:52.577873] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:06.542 [2024-07-12 13:57:54.794287] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:06.542 [2024-07-12 13:57:54.794363] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:06.542 [2024-07-12 13:57:54.794379] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:06.542 [2024-07-12 13:57:54.802305] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:06.542 [2024-07-12 13:57:54.802326] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:06.542 [2024-07-12 13:57:54.802338] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:06.542 [2024-07-12 13:57:54.810326] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:06.542 [2024-07-12 13:57:54.810344] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:06.542 [2024-07-12 13:57:54.810356] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:06.542 [2024-07-12 13:57:54.818346] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:06.542 [2024-07-12 13:57:54.818363] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:06.542 [2024-07-12 13:57:54.818375] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:06.542 13:57:54 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:06.801 1+0 records in 00:32:06.801 1+0 records out 00:32:06.801 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000291305 s, 14.1 MB/s 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:06.801 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:32:07.060 
13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:07.060 1+0 records in 00:32:07.060 1+0 records out 00:32:07.060 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000339155 s, 12.1 MB/s 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:07.060 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 
00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:07.319 1+0 records in 00:32:07.319 1+0 records out 00:32:07.319 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000303481 s, 13.5 MB/s 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:07.319 13:57:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:07.578 1+0 records in 00:32:07.578 1+0 records out 00:32:07.578 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000405563 s, 10.1 MB/s 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:07.578 13:57:56 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:07.578 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:07.837 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:07.837 { 00:32:07.837 "nbd_device": "/dev/nbd0", 00:32:07.837 "bdev_name": "crypto_ram" 00:32:07.837 }, 00:32:07.837 { 00:32:07.837 "nbd_device": "/dev/nbd1", 00:32:07.837 "bdev_name": "crypto_ram2" 00:32:07.837 }, 00:32:07.837 { 00:32:07.837 "nbd_device": "/dev/nbd2", 00:32:07.837 "bdev_name": "crypto_ram3" 00:32:07.837 }, 00:32:07.837 { 00:32:07.837 "nbd_device": "/dev/nbd3", 00:32:07.837 "bdev_name": "crypto_ram4" 00:32:07.837 } 00:32:07.837 ]' 00:32:07.837 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:07.837 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:07.837 { 00:32:07.837 "nbd_device": "/dev/nbd0", 00:32:07.837 "bdev_name": "crypto_ram" 00:32:07.837 }, 00:32:07.837 { 00:32:07.837 "nbd_device": "/dev/nbd1", 00:32:07.837 "bdev_name": "crypto_ram2" 00:32:07.837 }, 00:32:07.837 { 00:32:07.837 "nbd_device": "/dev/nbd2", 00:32:07.837 "bdev_name": "crypto_ram3" 00:32:07.837 }, 00:32:07.837 { 00:32:07.837 "nbd_device": "/dev/nbd3", 00:32:07.837 "bdev_name": "crypto_ram4" 00:32:07.837 } 00:32:07.837 ]' 00:32:07.837 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:07.837 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 
00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:08.096 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:08.355 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:08.355 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:08.355 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:08.355 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:08.355 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:08.355 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:08.355 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:08.355 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:08.355 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:08.355 13:57:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:32:08.613 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:32:08.613 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:32:08.613 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:32:08.613 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:08.613 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:08.613 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:32:08.613 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:08.613 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:08.613 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:08.613 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:32:08.872 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:32:08.872 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:32:08.872 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:32:08.872 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:08.872 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:08.872 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:32:08.872 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:08.872 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:08.872 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:08.872 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
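The start/stop verification just traced exposes each crypto bdev as a kernel NBD device through the /var/tmp/spdk-nbd.sock RPC socket, waits for the node to show up in /proc/partitions, reads a single 4 KiB block with direct I/O, and then tears the device down again. A condensed sketch of that per-bdev cycle follows, built only from the rpc.py calls visible in the trace; the helper name, retry timing, and scratch-file path are illustrative, not harness names.

```bash
# Condensed sketch of the per-bdev NBD start/verify/stop cycle traced above.
# check_one_nbd, the sleep interval and TMP are illustrative, not harness names.
RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-nbd.sock
TMP=/tmp/nbdtest

check_one_nbd() {
    local bdev=$1 nbd
    # Expose the bdev as a kernel NBD device; the RPC returns the assigned node.
    nbd=$("$RPC" -s "$SOCK" nbd_start_disk "$bdev")
    # Wait (up to 20 tries, as in the trace) for the kernel to list the device.
    for _ in $(seq 1 20); do
        grep -q -w "$(basename "$nbd")" /proc/partitions && break
        sleep 0.1
    done
    # One direct-I/O read proves the device is usable.
    dd if="$nbd" of="$TMP" bs=4096 count=1 iflag=direct
    # Tear the device down again.
    "$RPC" -s "$SOCK" nbd_stop_disk "$nbd"
}

for bdev in crypto_ram crypto_ram2 crypto_ram3 crypto_ram4; do
    check_one_nbd "$bdev"
done
```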
00:32:08.872 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:09.131 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:09.390 /dev/nbd0 00:32:09.390 13:57:57 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:09.390 1+0 records in 00:32:09.390 1+0 records out 00:32:09.390 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00030733 s, 13.3 MB/s 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:09.390 13:57:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:32:09.649 /dev/nbd1 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:09.649 1+0 records in 00:32:09.649 1+0 records out 00:32:09.649 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228955 s, 17.9 MB/s 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:09.649 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:32:09.908 /dev/nbd10 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:09.908 1+0 records in 00:32:09.908 1+0 records out 00:32:09.908 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311087 s, 13.2 MB/s 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # 
(( i < 4 )) 00:32:09.908 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:32:10.166 /dev/nbd11 00:32:10.166 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:32:10.166 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:32:10.166 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:32:10.166 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:10.166 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:10.167 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:10.167 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:32:10.167 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:10.167 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:10.167 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:10.167 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:10.167 1+0 records in 00:32:10.167 1+0 records out 00:32:10.167 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000372817 s, 11.0 MB/s 00:32:10.167 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:10.425 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:10.425 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:10.425 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:10.425 13:57:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:10.425 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:10.425 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:10.425 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:10.425 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:10.425 13:57:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:10.684 { 00:32:10.684 "nbd_device": "/dev/nbd0", 00:32:10.684 "bdev_name": "crypto_ram" 00:32:10.684 }, 00:32:10.684 { 00:32:10.684 "nbd_device": "/dev/nbd1", 00:32:10.684 "bdev_name": "crypto_ram2" 00:32:10.684 }, 00:32:10.684 { 00:32:10.684 "nbd_device": "/dev/nbd10", 00:32:10.684 "bdev_name": "crypto_ram3" 00:32:10.684 }, 00:32:10.684 { 00:32:10.684 "nbd_device": "/dev/nbd11", 00:32:10.684 "bdev_name": "crypto_ram4" 00:32:10.684 } 00:32:10.684 ]' 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:32:10.684 { 00:32:10.684 "nbd_device": "/dev/nbd0", 00:32:10.684 "bdev_name": "crypto_ram" 00:32:10.684 }, 00:32:10.684 { 00:32:10.684 "nbd_device": "/dev/nbd1", 00:32:10.684 "bdev_name": "crypto_ram2" 00:32:10.684 }, 00:32:10.684 { 00:32:10.684 "nbd_device": "/dev/nbd10", 00:32:10.684 "bdev_name": "crypto_ram3" 00:32:10.684 }, 00:32:10.684 { 00:32:10.684 "nbd_device": "/dev/nbd11", 00:32:10.684 "bdev_name": "crypto_ram4" 00:32:10.684 } 00:32:10.684 ]' 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:10.684 /dev/nbd1 00:32:10.684 /dev/nbd10 00:32:10.684 /dev/nbd11' 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:10.684 /dev/nbd1 00:32:10.684 /dev/nbd10 00:32:10.684 /dev/nbd11' 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:10.684 256+0 records in 00:32:10.684 256+0 records out 00:32:10.684 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114986 s, 91.2 MB/s 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:10.684 256+0 records in 00:32:10.684 256+0 records out 00:32:10.684 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0505765 s, 20.7 MB/s 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:10.684 256+0 records in 00:32:10.684 256+0 records out 00:32:10.684 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0659955 s, 15.9 MB/s 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:10.684 13:57:59 blockdev_crypto_aesni.bdev_nbd 
-- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:32:10.943 256+0 records in 00:32:10.943 256+0 records out 00:32:10.943 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0552075 s, 19.0 MB/s 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:32:10.943 256+0 records in 00:32:10.943 256+0 records out 00:32:10.943 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0562447 s, 18.6 MB/s 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:10.943 
13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:10.943 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:11.202 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:11.202 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:11.202 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:11.202 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:11.202 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:11.202 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:11.202 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:11.202 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:11.202 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:11.202 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:11.462 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:11.462 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:11.462 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:11.462 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:11.462 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:11.462 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:11.462 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:11.462 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:11.462 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:11.462 13:57:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:32:11.721 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:32:11.721 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:32:11.721 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:32:11.721 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:11.721 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:11.721 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:32:11.721 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:11.721 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:11.721 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:11.721 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:32:11.979 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:32:11.979 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:32:11.979 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:32:11.979 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:11.979 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:11.979 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:32:11.979 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:11.979 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:11.979 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:11.979 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:11.979 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:12.239 13:58:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:12.497 malloc_lvol_verify 00:32:12.497 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:12.755 46bd455f-36a6-4106-a9ea-757598f848c6 00:32:12.755 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:13.012 ce47df56-827b-439c-a4cb-4cd75bf07904 00:32:13.012 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:13.269 /dev/nbd0 00:32:13.269 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:13.269 mke2fs 1.46.5 (30-Dec-2021) 00:32:13.269 Discarding device blocks: 0/4096 done 00:32:13.269 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:13.269 00:32:13.269 Allocating group tables: 0/1 done 00:32:13.269 Writing inode tables: 0/1 done 00:32:13.269 Creating journal (1024 blocks): done 00:32:13.269 Writing superblocks and filesystem accounting information: 0/1 done 00:32:13.269 00:32:13.269 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:13.269 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:13.269 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:13.269 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:13.269 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:13.269 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:13.269 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:13.269 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:13.527 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 616156 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 616156 ']' 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 616156 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 616156 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 616156' 00:32:13.528 killing process with pid 616156 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 616156 00:32:13.528 13:58:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 616156 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:32:14.097 00:32:14.097 real 0m10.290s 00:32:14.097 user 0m13.246s 00:32:14.097 sys 0m4.099s 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:14.097 ************************************ 00:32:14.097 END TEST bdev_nbd 00:32:14.097 ************************************ 00:32:14.097 13:58:02 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:14.097 13:58:02 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:32:14.097 13:58:02 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:32:14.097 13:58:02 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:32:14.097 13:58:02 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:32:14.097 13:58:02 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:14.097 13:58:02 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:14.097 13:58:02 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:14.097 ************************************ 00:32:14.097 START TEST bdev_fio 00:32:14.097 ************************************ 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:14.097 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1281 -- # local workload=verify 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:14.097 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:14.098 ************************************ 00:32:14.098 START TEST bdev_fio_rw_verify 00:32:14.098 ************************************ 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:14.098 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:14.364 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:14.364 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:14.364 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:14.364 13:58:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:14.625 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:14.625 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:14.625 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:14.625 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:14.625 fio-3.35 00:32:14.625 Starting 4 threads 00:32:29.509 00:32:29.509 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=618057: Fri Jul 12 13:58:15 2024 00:32:29.509 read: IOPS=21.9k, BW=85.5MiB/s (89.7MB/s)(855MiB/10001msec) 00:32:29.509 slat (usec): min=16, max=453, avg=60.25, stdev=40.98 00:32:29.509 clat (usec): min=12, max=2654, avg=323.73, stdev=229.73 00:32:29.509 lat (usec): min=40, max=2935, avg=383.99, stdev=255.76 00:32:29.509 clat percentiles (usec): 00:32:29.509 | 50.000th=[ 265], 99.000th=[ 1172], 99.900th=[ 1369], 99.990th=[ 1483], 00:32:29.509 | 99.999th=[ 2507] 00:32:29.509 write: IOPS=24.1k, BW=94.2MiB/s (98.7MB/s)(916MiB/9727msec); 0 zone resets 00:32:29.509 slat (usec): min=23, max=1321, avg=74.01, stdev=40.71 00:32:29.509 clat (usec): min=19, max=2743, avg=399.18, stdev=270.27 00:32:29.509 lat (usec): min=48, max=3045, avg=473.19, stdev=295.21 00:32:29.509 clat percentiles (usec): 00:32:29.509 | 50.000th=[ 343], 99.000th=[ 1418], 99.900th=[ 1647], 99.990th=[ 1827], 00:32:29.509 | 99.999th=[ 2311] 00:32:29.509 bw ( KiB/s): min=77960, max=123816, per=97.63%, avg=94148.79, stdev=2996.14, samples=76 00:32:29.509 iops : min=19490, max=30954, avg=23537.16, stdev=749.01, samples=76 00:32:29.509 lat (usec) : 20=0.01%, 50=0.01%, 100=6.22%, 250=32.55%, 500=41.22% 00:32:29.509 lat (usec) : 750=11.56%, 1000=4.97% 00:32:29.509 lat (msec) : 2=3.48%, 4=0.01% 00:32:29.509 cpu : usr=99.58%, sys=0.01%, ctx=69, majf=0, minf=252 00:32:29.509 IO depths : 1=10.4%, 2=25.5%, 4=51.1%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:29.509 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:29.509 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:29.509 issued rwts: 
total=219005,234502,0,0 short=0,0,0,0 dropped=0,0,0,0 00:32:29.509 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:29.509 00:32:29.509 Run status group 0 (all jobs): 00:32:29.509 READ: bw=85.5MiB/s (89.7MB/s), 85.5MiB/s-85.5MiB/s (89.7MB/s-89.7MB/s), io=855MiB (897MB), run=10001-10001msec 00:32:29.509 WRITE: bw=94.2MiB/s (98.7MB/s), 94.2MiB/s-94.2MiB/s (98.7MB/s-98.7MB/s), io=916MiB (961MB), run=9727-9727msec 00:32:29.509 00:32:29.509 real 0m13.738s 00:32:29.509 user 0m46.206s 00:32:29.509 sys 0m0.551s 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:32:29.509 ************************************ 00:32:29.509 END TEST bdev_fio_rw_verify 00:32:29.509 ************************************ 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "974c9983-2676-5357-988b-0dd062ba5aa4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "974c9983-2676-5357-988b-0dd062ba5aa4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": 
{' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "8ba0ccfd-265b-5509-82de-57cd466a3c83"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8ba0ccfd-265b-5509-82de-57cd466a3c83",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "3419be7c-1d57-5966-ae9b-e7071a1b5244"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3419be7c-1d57-5966-ae9b-e7071a1b5244",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "d3fa17a2-9c6b-5e58-b87c-5a18629f6f62"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' 
"num_blocks": 8192,' ' "uuid": "d3fa17a2-9c6b-5e58-b87c-5a18629f6f62",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:32:29.509 crypto_ram2 00:32:29.509 crypto_ram3 00:32:29.509 crypto_ram4 ]] 00:32:29.509 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "974c9983-2676-5357-988b-0dd062ba5aa4"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "974c9983-2676-5357-988b-0dd062ba5aa4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "8ba0ccfd-265b-5509-82de-57cd466a3c83"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8ba0ccfd-265b-5509-82de-57cd466a3c83",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "3419be7c-1d57-5966-ae9b-e7071a1b5244"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3419be7c-1d57-5966-ae9b-e7071a1b5244",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "d3fa17a2-9c6b-5e58-b87c-5a18629f6f62"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d3fa17a2-9c6b-5e58-b87c-5a18629f6f62",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:32:29.510 13:58:16 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:29.510 ************************************ 00:32:29.510 START TEST bdev_fio_trim 00:32:29.510 ************************************ 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:29.510 13:58:16 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:29.510 13:58:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:29.510 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:29.510 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:29.510 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:29.510 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:29.510 fio-3.35 00:32:29.510 Starting 4 threads 00:32:41.713 00:32:41.713 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=619921: Fri Jul 12 13:58:29 2024 00:32:41.713 write: IOPS=37.0k, BW=144MiB/s (151MB/s)(1445MiB/10001msec); 0 zone resets 00:32:41.713 slat (usec): min=11, max=498, avg=62.18, stdev=36.93 00:32:41.713 clat (usec): min=33, max=2033, avg=273.22, stdev=178.30 
00:32:41.713 lat (usec): min=53, max=2353, avg=335.40, stdev=200.84 00:32:41.713 clat percentiles (usec): 00:32:41.713 | 50.000th=[ 229], 99.000th=[ 898], 99.900th=[ 1074], 99.990th=[ 1270], 00:32:41.713 | 99.999th=[ 1926] 00:32:41.713 bw ( KiB/s): min=138016, max=241616, per=100.00%, avg=148558.74, stdev=7358.15, samples=76 00:32:41.713 iops : min=34504, max=60404, avg=37139.68, stdev=1839.54, samples=76 00:32:41.713 trim: IOPS=37.0k, BW=144MiB/s (151MB/s)(1445MiB/10001msec); 0 zone resets 00:32:41.713 slat (usec): min=4, max=1415, avg=17.50, stdev= 7.96 00:32:41.713 clat (usec): min=5, max=1973, avg=258.01, stdev=128.34 00:32:41.713 lat (usec): min=26, max=1991, avg=275.52, stdev=131.40 00:32:41.713 clat percentiles (usec): 00:32:41.713 | 50.000th=[ 235], 99.000th=[ 627], 99.900th=[ 750], 99.990th=[ 881], 00:32:41.713 | 99.999th=[ 1303] 00:32:41.713 bw ( KiB/s): min=138016, max=241640, per=100.00%, avg=148559.58, stdev=7358.69, samples=76 00:32:41.713 iops : min=34504, max=60410, avg=37139.89, stdev=1839.67, samples=76 00:32:41.713 lat (usec) : 10=0.01%, 50=0.76%, 100=8.25%, 250=46.16%, 500=36.92% 00:32:41.713 lat (usec) : 750=6.48%, 1000=1.30% 00:32:41.713 lat (msec) : 2=0.13%, 4=0.01% 00:32:41.713 cpu : usr=99.61%, sys=0.00%, ctx=80, majf=0, minf=103 00:32:41.713 IO depths : 1=7.7%, 2=26.4%, 4=52.7%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:41.713 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:41.713 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:41.713 issued rwts: total=0,369833,369834,0 short=0,0,0,0 dropped=0,0,0,0 00:32:41.713 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:41.713 00:32:41.713 Run status group 0 (all jobs): 00:32:41.713 WRITE: bw=144MiB/s (151MB/s), 144MiB/s-144MiB/s (151MB/s-151MB/s), io=1445MiB (1515MB), run=10001-10001msec 00:32:41.713 TRIM: bw=144MiB/s (151MB/s), 144MiB/s-144MiB/s (151MB/s-151MB/s), io=1445MiB (1515MB), run=10001-10001msec 00:32:41.713 00:32:41.713 real 0m13.708s 00:32:41.713 user 0m46.176s 00:32:41.713 sys 0m0.520s 00:32:41.713 13:58:30 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:41.713 13:58:30 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:32:41.713 ************************************ 00:32:41.713 END TEST bdev_fio_trim 00:32:41.713 ************************************ 00:32:41.972 13:58:30 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:41.972 13:58:30 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:32:41.972 13:58:30 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:41.972 13:58:30 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:32:41.972 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:41.972 13:58:30 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:32:41.972 00:32:41.972 real 0m27.813s 00:32:41.972 user 1m32.563s 00:32:41.972 sys 0m1.278s 00:32:41.972 13:58:30 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:41.972 13:58:30 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:41.972 ************************************ 00:32:41.972 END TEST bdev_fio 00:32:41.972 ************************************ 00:32:41.972 13:58:30 blockdev_crypto_aesni -- 
common/autotest_common.sh@1142 -- # return 0 00:32:41.972 13:58:30 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:41.972 13:58:30 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:41.972 13:58:30 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:41.972 13:58:30 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:41.972 13:58:30 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:41.972 ************************************ 00:32:41.972 START TEST bdev_verify 00:32:41.972 ************************************ 00:32:41.972 13:58:30 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:41.972 [2024-07-12 13:58:30.470111] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:32:41.972 [2024-07-12 13:58:30.470180] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid621317 ] 00:32:42.231 [2024-07-12 13:58:30.603809] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:42.231 [2024-07-12 13:58:30.715939] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:42.231 [2024-07-12 13:58:30.715939] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:42.231 [2024-07-12 13:58:30.737357] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:42.231 [2024-07-12 13:58:30.745383] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:42.231 [2024-07-12 13:58:30.753401] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:42.488 [2024-07-12 13:58:30.861483] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:45.017 [2024-07-12 13:58:33.075027] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:45.017 [2024-07-12 13:58:33.075115] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:45.017 [2024-07-12 13:58:33.075130] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:45.017 [2024-07-12 13:58:33.083044] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:45.017 [2024-07-12 13:58:33.083063] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:45.017 [2024-07-12 13:58:33.083075] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:45.017 [2024-07-12 13:58:33.091065] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:45.017 [2024-07-12 13:58:33.091085] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:45.017 [2024-07-12 
13:58:33.091097] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:45.017 [2024-07-12 13:58:33.099087] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:45.017 [2024-07-12 13:58:33.099105] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:45.017 [2024-07-12 13:58:33.099121] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:45.017 Running I/O for 5 seconds... 00:32:50.287 00:32:50.287 Latency(us) 00:32:50.287 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:50.287 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:50.287 Verification LBA range: start 0x0 length 0x1000 00:32:50.287 crypto_ram : 5.08 474.20 1.85 0.00 0.00 268892.00 3561.74 162301.33 00:32:50.287 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:50.287 Verification LBA range: start 0x1000 length 0x1000 00:32:50.287 crypto_ram : 5.07 378.49 1.48 0.00 0.00 337164.08 16754.42 202420.76 00:32:50.287 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:50.287 Verification LBA range: start 0x0 length 0x1000 00:32:50.287 crypto_ram2 : 5.08 475.65 1.86 0.00 0.00 267450.68 4074.63 150447.86 00:32:50.287 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:50.287 Verification LBA range: start 0x1000 length 0x1000 00:32:50.287 crypto_ram2 : 5.07 378.40 1.48 0.00 0.00 335979.56 17438.27 185096.46 00:32:50.287 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:50.287 Verification LBA range: start 0x0 length 0x1000 00:32:50.287 crypto_ram3 : 5.06 3665.29 14.32 0.00 0.00 34606.11 5214.39 26670.30 00:32:50.287 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:50.287 Verification LBA range: start 0x1000 length 0x1000 00:32:50.287 crypto_ram3 : 5.07 2956.51 11.55 0.00 0.00 42851.51 3846.68 31229.33 00:32:50.287 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:50.287 Verification LBA range: start 0x0 length 0x1000 00:32:50.287 crypto_ram4 : 5.06 3665.84 14.32 0.00 0.00 34518.35 5385.35 26214.40 00:32:50.287 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:50.287 Verification LBA range: start 0x1000 length 0x1000 00:32:50.287 crypto_ram4 : 5.07 2955.71 11.55 0.00 0.00 42750.68 4160.11 31001.38 00:32:50.287 =================================================================================================================== 00:32:50.287 Total : 14950.08 58.40 0.00 0.00 68010.60 3561.74 202420.76 00:32:50.287 00:32:50.287 real 0m8.318s 00:32:50.287 user 0m15.744s 00:32:50.287 sys 0m0.381s 00:32:50.287 13:58:38 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:50.287 13:58:38 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:32:50.287 ************************************ 00:32:50.287 END TEST bdev_verify 00:32:50.287 ************************************ 00:32:50.287 13:58:38 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:50.287 13:58:38 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 
-o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:50.287 13:58:38 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:50.287 13:58:38 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:50.287 13:58:38 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:50.287 ************************************ 00:32:50.287 START TEST bdev_verify_big_io 00:32:50.287 ************************************ 00:32:50.287 13:58:38 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:50.546 [2024-07-12 13:58:38.875096] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:32:50.546 [2024-07-12 13:58:38.875163] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid622379 ] 00:32:50.546 [2024-07-12 13:58:39.007819] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:50.546 [2024-07-12 13:58:39.111751] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:50.546 [2024-07-12 13:58:39.111755] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:50.847 [2024-07-12 13:58:39.133164] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:50.847 [2024-07-12 13:58:39.141184] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:50.847 [2024-07-12 13:58:39.149206] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:50.847 [2024-07-12 13:58:39.261174] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:53.416 [2024-07-12 13:58:41.495770] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:53.416 [2024-07-12 13:58:41.495851] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:53.416 [2024-07-12 13:58:41.495867] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:53.416 [2024-07-12 13:58:41.503785] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:53.416 [2024-07-12 13:58:41.503804] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:53.416 [2024-07-12 13:58:41.503816] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:53.416 [2024-07-12 13:58:41.511809] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:53.416 [2024-07-12 13:58:41.511830] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:53.416 [2024-07-12 13:58:41.511842] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:53.416 [2024-07-12 13:58:41.519832] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:53.416 [2024-07-12 13:58:41.519849] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:53.416 
[2024-07-12 13:58:41.519860] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:53.416 Running I/O for 5 seconds... 00:32:53.984 [2024-07-12 13:58:42.531995] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:53.984 [2024-07-12 13:58:42.532621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:53.984 [2024-07-12 13:58:42.532846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:53.984 [2024-07-12 13:58:42.532953] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:53.984 [2024-07-12 13:58:42.533017] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:53.984 [2024-07-12 13:58:42.533479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.535146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.535216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.535268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.535322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.535813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.535873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.535934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.535992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.536429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.537659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.537732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.537788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.537841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.538546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.538607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.538660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.538712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
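
The nbd teardown traced near the top of this section (nbd_common.sh@35-45) repeatedly issues nbd_stop_disk over the RPC socket and then polls /proc/partitions until the device name disappears. Below is a minimal, self-contained bash sketch of that polling pattern; the 20-iteration budget and the grep -q -w test mirror the xtrace output, while the function name, the 0.1 s back-off and the explicit timeout message are illustrative additions rather than the actual nbd_common.sh helper.

# Sketch of the wait-for-detach pattern seen as waitfornbd_exit in the trace.
wait_for_nbd_exit() {
    local nbd_name=$1          # e.g. nbd0, nbd10 - bare name as matched in /proc/partitions
    local i
    for ((i = 1; i <= 20; i++)); do
        if ! grep -q -w "$nbd_name" /proc/partitions; then
            return 0           # device is gone, nbd_stop_disk has completed
        fi
        sleep 0.1              # assumed back-off; the real helper may differ
    done
    echo "timed out waiting for /dev/$nbd_name to detach" >&2
    return 1
}

# Usage, mirroring the logged sequence (paths as in this workspace):
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
#     -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
# wait_for_nbd_exit nbd0
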
00:32:53.984 [2024-07-12 13:58:42.539063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.540281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.540346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.540398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.540450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.540949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.541008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.541060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.541111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.541481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.543085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.543148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.543199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.543769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.543834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.543893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.545351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.545413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.545465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.545516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.546070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.546131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.546190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.546252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
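
The nbd_with_lvol_verify step traced above (nbd_common.sh@131-142) builds a small stack before formatting it: a 16 MiB malloc bdev, an lvstore on top of it, a 4 MiB lvol, and an nbd export that mkfs.ext4 is run against. A condensed sketch of those RPC calls follows, assuming an SPDK application is already listening on /var/tmp/spdk-nbd.sock and that rpc.py is invoked from a local checkout; the bdev names and sizes are taken from the log.

#!/usr/bin/env bash
set -e
rpc="./scripts/rpc.py -s /var/tmp/spdk-nbd.sock"   # assumed relative path to the SPDK checkout

$rpc bdev_malloc_create -b malloc_lvol_verify 16 512   # 16 MiB malloc bdev, 512 B blocks
$rpc bdev_lvol_create_lvstore malloc_lvol_verify lvs   # lvstore "lvs" on the malloc bdev
$rpc bdev_lvol_create lvol 4 -l lvs                    # 4 MiB lvol inside lvs
$rpc nbd_start_disk lvs/lvol /dev/nbd0                 # expose the lvol as /dev/nbd0

mkfs.ext4 /dev/nbd0                                    # filesystem creation, as in the log
$rpc nbd_stop_disk /dev/nbd0                           # detach when done
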
00:32:53.984 [2024-07-12 13:58:42.548210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.548274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.548326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.548383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.548872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.548942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.548994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.549046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.550558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.550621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.550673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.550730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.551225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.551288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.551339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.551390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.552997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.553066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.553123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.553174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.984 [2024-07-12 13:58:42.553705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.553768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.553820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.553871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.985 [2024-07-12 13:58:42.555458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.555528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.555586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.555642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.556211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.556270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.556322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.556373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.558188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.558251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.558303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.558355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.558944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.559013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.559067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.559124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.560624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.560696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.560748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.560799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.561370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.561432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.561484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.561537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:53.985 [2024-07-12 13:58:42.563340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.563420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.563478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.563529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.564075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.564139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.564191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:53.985 [2024-07-12 13:58:42.564242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.565905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.565990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.566043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.566095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.566615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.566673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.566726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.566778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.568434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.568497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.568548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.568599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.569186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.569254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.569307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.569361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.246 [2024-07-12 13:58:42.570790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.570853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.570905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.570964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.571558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.571616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.571669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.571721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.573372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.573440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.573499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.573553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.574051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.574116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.574172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.574223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.575752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.575815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.575870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.575937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.576644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.576704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.576756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.576807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.246 [2024-07-12 13:58:42.578393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.578461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.578515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.578566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.579150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.579209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.579262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.579313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.580904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.580983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.581036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.581102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.581748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.581808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.581862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.581916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.583445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.583513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.583564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.583615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.584141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.584205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.584263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.584314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.246 [2024-07-12 13:58:42.585851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.585942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.585996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.586048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.586630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.586691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.246 [2024-07-12 13:58:42.586743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.586795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.588272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.588334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.588386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.588437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.588979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.589039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.589098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.589151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.590577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.590640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.590692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.590743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.591392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.591452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.591509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.591561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.247 [2024-07-12 13:58:42.593159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.593241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.593301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.593354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.593843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.593914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.593978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.594030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.595501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.595564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.595617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.595670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.596389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.596449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.596500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.596552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.598192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.598256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.598308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.598361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.598850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.598909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.598966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.599018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.247 [2024-07-12 13:58:42.600600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.600663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.600733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.600785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.601397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.601456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.601508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.601559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.603161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.603230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.603286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.603347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.603867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.603937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.603989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.604040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.605656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.605719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.605772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.605823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.606316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.606374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.606434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.606487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.247 [2024-07-12 13:58:42.607974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.608036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.608087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.608139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.608654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.608712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.608771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.608823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.610532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.610595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.610646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.610700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.611198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.611263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.611318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.611373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.612818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.614695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.616601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.618500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.619041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.619539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.621532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.623430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.247 [2024-07-12 13:58:42.626727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.628651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.629892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.630395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.632695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.634712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.636695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.638701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.640548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.641677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.247 [2024-07-12 13:58:42.643556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.645441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.647320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.649193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.651095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.653072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.656787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.658689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.660084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.661953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.664368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.665531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.666030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.667963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.248 [2024-07-12 13:58:42.671227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.673255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.675177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.676883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.678570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.680452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.682351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.684256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.687668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.688182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.688803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.690674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.693102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.694502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.696353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.698243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.701475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.703506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.705529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.707437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.709892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.711885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.713630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.714126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.248 [2024-07-12 13:58:42.717550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.718952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.720819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.722720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.723721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.724336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.726205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.728112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.731510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.733409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.734273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.734762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.737221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.739135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.740740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.742639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.744506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.745993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.747848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.749731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.751772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.753646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.755668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.757685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.248 [2024-07-12 13:58:42.761519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.763421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.764833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.766692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.769119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.769973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.770466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.772479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.775855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.777762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.779668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.781056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.783006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.784872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.786811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.788838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.792187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.792714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.793211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.794725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.797156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.797670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.798168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.798661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.248 [2024-07-12 13:58:42.800765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.801291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.801787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.802292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.803438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.803950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.804446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.804944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.807197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.807703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.808208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.808700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.809743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.248 [2024-07-12 13:58:42.810266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.810761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.811263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.813551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.814060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.814554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.815059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.816195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.816713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.817215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.817708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.249 [2024-07-12 13:58:42.820078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.820582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.821106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.821637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.822763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.823295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.823810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.249 [2024-07-12 13:58:42.824321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.826548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.827071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.827571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.828079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.829218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.829748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.830250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.830747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.833261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.833773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.834273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.834768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.835855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.836366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.836863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.510 [2024-07-12 13:58:42.837362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.510 [2024-07-12 13:58:42.839698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:54.775 [... identical *ERROR* entries from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated continuously between 13:58:42.839698 and 13:58:43.201061; duplicate entries omitted ...]
00:32:54.775 [2024-07-12 13:58:43.201061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:54.775 [2024-07-12 13:58:43.203084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.203427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.205554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.207255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.209225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.211127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.213830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.215708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.217592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.219487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.219863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.221842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.223739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.225638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.226149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.229562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.231114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.232988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.234884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.235275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.236633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.237143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.238950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.240854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.775 [2024-07-12 13:58:43.244148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.246184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.248208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.248712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.249207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.775 [2024-07-12 13:58:43.251175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.253059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.254944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.256350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.258863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.259378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.261403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.263428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.263768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.265653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.267659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.269559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.271441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.275232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.277138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.279030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.280446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.280846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.282899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.776 [2024-07-12 13:58:43.284896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.285412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.286128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.288976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.290847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.292740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.294627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.295018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.295631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.297622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.299642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.301639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.305036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.306981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.307483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.308555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.308961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.310992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.312902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.314307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.316186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.318181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.320212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.322112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:54.776 [2024-07-12 13:58:43.323997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.324372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.326370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.328247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.330136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.331302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.334699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.336616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.338304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.340170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.340532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.342665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.343177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.344048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.345907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.349291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.351202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.353117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:54.776 [2024-07-12 13:58:43.353811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.354364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.356419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.358299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.360234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.361653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.036 [2024-07-12 13:58:43.364286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.364790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.366517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.368453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.368797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.370946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.372975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.375000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.376924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.380824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.382728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.384640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.386061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.386451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.388489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.390394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.390896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.391401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.394437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.396323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.398192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.400057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.400467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.401089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.036 [2024-07-12 13:58:43.402692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.404550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.406529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.409846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.411883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.412392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.413105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.413508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.415535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.417436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.418808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.420651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.422639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.424588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.426615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.428638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.428991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.431130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.433124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.435014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.436539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.439996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.441876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.443275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.036 [2024-07-12 13:58:43.445143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.445520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.447538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.448178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.448674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.450680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.454048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.456038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.457951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.459497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.036 [2024-07-12 13:58:43.460005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.461339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.463193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.465104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.467010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.470403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.470914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.471436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.473305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.473694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.475723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.477152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.479028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.480923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.037 [2024-07-12 13:58:43.484003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.485883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.487915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.487992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.488333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.490474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.492404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.494294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.495790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.499351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.499420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.499473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.499533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.499873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.501886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.501967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.502027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.502079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.503537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.503601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.503654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.503708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.504207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.504389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.037 [2024-07-12 13:58:43.504451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.504504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.504555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.506099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.506162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.506214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.506274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.506612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.506799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.506856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.506923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.506984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.508418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.508481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.508534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.508589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.509125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.509313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.509375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.509427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.509478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.511072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.511136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.511188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.037 [2024-07-12 13:58:43.511240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.511632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.511811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.511882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.511947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.512004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.513821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.513898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.513970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.514038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.514372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.514553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.514610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.514661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.514727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.516423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.516494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.516547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.516598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.516944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.517127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.517197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.517259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.517311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.037 [2024-07-12 13:58:43.518831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.518895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.518956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.519015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.519514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.519693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.519753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.519806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.519871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.521732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.521796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.521852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.521905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.522473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.522673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.522748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.522814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.522883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.524522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.524586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.524640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.524705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.525174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.525379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.037 [2024-07-12 13:58:43.525445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.525533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.525588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.527435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.527513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.527567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.527631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.528071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.528277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.528335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.528406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.528478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.530327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.530392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.530445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.530498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.531048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.531239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.531309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.531375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.531438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.533149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.533215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.533270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.037 [2024-07-12 13:58:43.533334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.533817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.534012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.534087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.534163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.534228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.536039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.536115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.536170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.536234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.536653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.536834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.536894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.536959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.537028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.538885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.538957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.539011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.539064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.539579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.539763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.539832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.539896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.037 [2024-07-12 13:58:43.539967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.037 [2024-07-12 13:58:43.541650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
... [the same *ERROR* line from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources: "Failed to get src_mbufs!") repeats continuously for every timestamp from 13:58:43.541650 through 13:58:44.024276] ...
00:32:55.563 [2024-07-12 13:58:44.024276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:55.563 [2024-07-12 13:58:44.025754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.025818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.025872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.025932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.026269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.026448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.026507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.026558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.026610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.028192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.028267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.028320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.028372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.028708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.028904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.028978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.029031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.029083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.030660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.030724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.030775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.030834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.031319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.031501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.563 [2024-07-12 13:58:44.031558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.031611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.031662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.033896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.033986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.034052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.034107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.034639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.034819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.034877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.034938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.034993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.036558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.036621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.036672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.036723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.037228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.037424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.037492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.037549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.037601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.039136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.039204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.039258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.563 [2024-07-12 13:58:44.039311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.039804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.040004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.040067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.040119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.040171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.041626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.041689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.041742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.041793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.042265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.042446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.042502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.042555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.042606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.044072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.044136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.044189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.044241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.044658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.044835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.044891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.044953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.045009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.563 [2024-07-12 13:58:44.046504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.563 [2024-07-12 13:58:44.046575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.046628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.046681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.047023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.047207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.047266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.047324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.047375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.049032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.049116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.049169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.049233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.049743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.049924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.049993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.050045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.050101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.051605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.051668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.051741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.051793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.052181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.052359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.564 [2024-07-12 13:58:44.052415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.052467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.052518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.054142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.054206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.054258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.054310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.054647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.054827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.054891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.054951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.055008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.056592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.056655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.056707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.056759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.057137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.057316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.057383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.057435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.057486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.059196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.059260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.059312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.564 [2024-07-12 13:58:44.059380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.059718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.059896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.059964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.060017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.060069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.061603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.061673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.061728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.061785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.062128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.062307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.062364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.062415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.062467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.064306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.064370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.064422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.064474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.064845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.065031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.065088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.065148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.065204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.564 [2024-07-12 13:58:44.066695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.066758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.066811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.066861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.067205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.067382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.067454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.067507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.067558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.069620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.069683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.069735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.069787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.070162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.070339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.070400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.070453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.070504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.072141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.072203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.072255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.072307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.072654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.072836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.564 [2024-07-12 13:58:44.072893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.072952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.073004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.074652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.564 [2024-07-12 13:58:44.074723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.074779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.074831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.075172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.075356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.075412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.075463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.075527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.077084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.077155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.077207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.077259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.077595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.077773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.077839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.077893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.077953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.079880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.079950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.080004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.565 [2024-07-12 13:58:44.080056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.080392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.080568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.080629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.080680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.080732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.082333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.082395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.082447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.082504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.082855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.083045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.083104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.083166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.083218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.084855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.084934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.084992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.085044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.085383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.085561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.085618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.085677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.085731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.565 [2024-07-12 13:58:44.087311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.087378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.087431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.088661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.089168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.089349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.089406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.089462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.089513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.091001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.091969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.093837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.095985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.096324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.096503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.097011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.097503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.099505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.102943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.105023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.107160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.107996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.108442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.110111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.565 [2024-07-12 13:58:44.111983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.114086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.115656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.118526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.119051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.120047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.121896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.122244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.124512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.125819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.127676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.565 [2024-07-12 13:58:44.129807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.566 [2024-07-12 13:58:44.133617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.566 [2024-07-12 13:58:44.135809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.566 [2024-07-12 13:58:44.138005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.566 [2024-07-12 13:58:44.138952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.566 [2024-07-12 13:58:44.139335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.141740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.143656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.144156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.144759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.147123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.149002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.151176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.825 [2024-07-12 13:58:44.153202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.153745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.154367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.156248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.158394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.160539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.164229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.164739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.165246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.167258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.167598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.169866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.170854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.172712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.174835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.178060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.179949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.182127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.183621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.825 [2024-07-12 13:58:44.183970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.186099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.188234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.189458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.189953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.826 [2024-07-12 13:58:44.193441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.194989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.196837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.198890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.199237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.199843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.200701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.202560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.204695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.208279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.210393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.210894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.211386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.211732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.214004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.216152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.217058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.218912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.220979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.222829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.224851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.226995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.227425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:55.826 [2024-07-12 13:58:44.229550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:55.826 [2024-07-12 13:58:44.231642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:56.092 [2024-07-12 13:58:44.637206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (same *ERROR* line emitted for every allocation attempt between these two timestamps)
00:32:56.092 [2024-07-12 13:58:44.637812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.639818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.641404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.643940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.644444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.644945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.645440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.645946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.646560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.647068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.647562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.648061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.650365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.092 [2024-07-12 13:58:44.650869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.651387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.651878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.652361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.652983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.653484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.653994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.654495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.656617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.657130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.657631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.093 [2024-07-12 13:58:44.658132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.658667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.659292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.659796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.660297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.660789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.663311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.663846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.664379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.664892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.665418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.666061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.093 [2024-07-12 13:58:44.666568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.908464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.909562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.918513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.920345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.920398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.922204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.922259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.923936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.924207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.924224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.925496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.352 [2024-07-12 13:58:44.925898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.927839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.929311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.929725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.931376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.932999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.352 [2024-07-12 13:58:44.933509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.933555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.933826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.933843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.933857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.938038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.938438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.938826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.940259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.942423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.942480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.944247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.944294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.944565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.944588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.944603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.948877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.950785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.614 [2024-07-12 13:58:44.951180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.951565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.953715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.953772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.955100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.955146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.955416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.955433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.955447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.959755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.961452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.963154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.964899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.965776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.965832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.966228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.966276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.966548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.966564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.966578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.970887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.972831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.974164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.975853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.614 [2024-07-12 13:58:44.978077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.978134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.978522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.978565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.978895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.978911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.978932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.981505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.983223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.983272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.984759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.986956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.988753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.988802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.990506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.990779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.990795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.990809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.993405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.995201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.995246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.997179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.999246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:44.999302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.614 [2024-07-12 13:58:45.000254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:45.001501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:45.001547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:45.001865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:45.001882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:45.001896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:45.001910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:45.006311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.614 [2024-07-12 13:58:45.006710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.007106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.008264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.010566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.010642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.012569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.012619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.013047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.013064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.013079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.013093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.017582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.019267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.019658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.020049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.022312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.615 [2024-07-12 13:58:45.022368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.022786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.022835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.023111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.023128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.023143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.023157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.026471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.026880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.026946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.027335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.028259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.028318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.028709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.028765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.029206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.029224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.029238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.029252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.031018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.031425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.031475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.031864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.032745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.615 [2024-07-12 13:58:45.032801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.033198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.033254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.033639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.033656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.033671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.033685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.036162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.036221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.036613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.037013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.037992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.038050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.038437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.038486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.038794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.038811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.038825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.038840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.040837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.041241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.041654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.041723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.042113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.615 [2024-07-12 13:58:45.042573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.043084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.043147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.043534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.043584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.043979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.043997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.044012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.044026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.045971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.046377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.046427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.046811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.047177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.047677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.047754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.048157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.048220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.048577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.048594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.048609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.048623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.050334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.050735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.615 [2024-07-12 13:58:45.050787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.051178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.051454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.051966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.052023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.052422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.052471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.052811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.052829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.052844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.052859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.615 [2024-07-12 13:58:45.054601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.055011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.055063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.055446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.055794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.056298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.056371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.056762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.056811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.057165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.057182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.057197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.057212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.616 [2024-07-12 13:58:45.058383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.058779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.058837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.059234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.059567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.060073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.060127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.060509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.060551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.060921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.060944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.060963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.060977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.062993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.063391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.063436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.063838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.064167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.064659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.064711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.065100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.065143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.065607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.065625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.616 [2024-07-12 13:58:45.065640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.065654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.067633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.068042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.068087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.068476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.068804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.069317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.069374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.069759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.069803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.070295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.070314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.070330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.070345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.072226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.072639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.072684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.073088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.073475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.073991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.074050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.074433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.074491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.616 [2024-07-12 13:58:45.075021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.075039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.075054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.075068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.076957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.077358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.077403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.077790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.078186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.078681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.078740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.079133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.079176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.079655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.079672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.079686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.079704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.081572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.081980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.082027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.082416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.082882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.083387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.083443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.616 [2024-07-12 13:58:45.083834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.083878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.084316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.084333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.084347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.084362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.086264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.086663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.086707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.087107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.087572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.088077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.088149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.088536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.088580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.616 [2024-07-12 13:58:45.089002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.089020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.089034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.089049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.090939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.091337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.091382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.091770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.092264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.617 [2024-07-12 13:58:45.092762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.092818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.093214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.093259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.093671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.093687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.093702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.093721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.095611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.096017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.096063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.096451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.096471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.096730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.096745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.097254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.097311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.097858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.098210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.098228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.098244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.102915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.102982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.103376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.617 [2024-07-12 13:58:45.103426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.103686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.103703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.105316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.105372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.107067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.107113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.107474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.107491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.107506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.110069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.110120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.110161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.110505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.112385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.112439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.114226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.114272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.114612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.114628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.114642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.118446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.118499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.118540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.617 [2024-07-12 13:58:45.118580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.118884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.119391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.119443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.119836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.119879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.120151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.120168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.120182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.123824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.123876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.123916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.123964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.124346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.126386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.126446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.128104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.128149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.128508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.128525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.128543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.131603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.131655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.131696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.617 [2024-07-12 13:58:45.131737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.132006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.133596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.133649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.135541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.135585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.135848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.135865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.135879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.139666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.139719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.139760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.139801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.140141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.142033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.142086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.143882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.143932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.617 [2024-07-12 13:58:45.144308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.144325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.144340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.147715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.147767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.147808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.618 [2024-07-12 13:58:45.147849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.148176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.149976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.150035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.151133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.151178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.151442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.151459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.151473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.155345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.155398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.155443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.155484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.155867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.156031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.156078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.156467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.156510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.156841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.156857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.156872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.160971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.161023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.161064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.618 [2024-07-12 13:58:45.161104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.161368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.161983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.162038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.163710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.163756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.164027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.164044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.164058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.167615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.167669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.167710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.167750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.168144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.169953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.170009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.171709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.171754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.172029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.172047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.172061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.175650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.175702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.175744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.618 [2024-07-12 13:58:45.175793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.176063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.176693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.176747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.177137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.177189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.177725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.177743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.177759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.181964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.182021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.183859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.183904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.184313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.186201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.186255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.188184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.188226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.188494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.188510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.188524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.191875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.618 [2024-07-12 13:58:45.191941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.193699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.879 [2024-07-12 13:58:45.193741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.194095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.194249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.195950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.195998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.196039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.196304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.196320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.196334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.201677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.201735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.203402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.203449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.203781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.204289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.204342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.204385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.204785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.205061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.205079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.205093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.208977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.209033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.209631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.879 [2024-07-12 13:58:45.209675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.209948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.210101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.211435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.211482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.879 [2024-07-12 13:58:45.213176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.213444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.213460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.213474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.217014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.217072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.217112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.217153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.217419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.217577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.219518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.219570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.221483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.221749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.221765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.221780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.226618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.226675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.228466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.880 [2024-07-12 13:58:45.228512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.228999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.229156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.229555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.229599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.229996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.230262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.230278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.230292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.235013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.235071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.235112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.237044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.237361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.237514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.239216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.239264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.240951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.241218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.241235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.241250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.242946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.243015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.244631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.880 [2024-07-12 13:58:45.244677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.244946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.245103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.246806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.246854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.247671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.247947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.247964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.247978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.251886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.252613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.252680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.253068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.253512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.253668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.255132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.255179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.256504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.256771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.256788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.256802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.260165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.261496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.261544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.880 [2024-07-12 13:58:45.263246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.263513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.263669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.265152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.265200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.265585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.265882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.265899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.265913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.268498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.270206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.270255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.271577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.271913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.272072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.273414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.273462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.275231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.275501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.275517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.275531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.278547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.279591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.279640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.880 [2024-07-12 13:58:45.280976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.281242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.281397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.283331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.283387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.285292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.285727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.285743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.880 [2024-07-12 13:58:45.285757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.290311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.291984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.292031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.292416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.292754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.292906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.293315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.293364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.295137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.295407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.295426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.295441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.299773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.301187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.301236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.881 [2024-07-12 13:58:45.303034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.303304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.303461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.305391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.305441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.305831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.306315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.306334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.306348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.308996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.310868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.310920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.312845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.313328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.313488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.315331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.315377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.317324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.317593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.317611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.317626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.320667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.321303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.321353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.881 [2024-07-12 13:58:45.322891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.323164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.323322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.325140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.325191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.326957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.327363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.327386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.327401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.331570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.333517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.333572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.333964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.334394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.334552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.334950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.335016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.336948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.337312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.337330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.337345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.342030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.343738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.343794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.881 [2024-07-12 13:58:45.345732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.346007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.346167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.347874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.347930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.348411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.348902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.348923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.348947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.351690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.353656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.353702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.355661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.356137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.356300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.358237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.358292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.359912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.360186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.360205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.360220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.363348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.363753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.363802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.881 [2024-07-12 13:58:45.365647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.365919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.366087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.366135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.367867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.367916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.368188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.368207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.368222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.371303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.373247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.881 [2024-07-12 13:58:45.373322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.373367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.373630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.374140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.374195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.374583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.374979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.375251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.375270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.375289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.379419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.379829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.882 [2024-07-12 13:58:45.381767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.882 [2024-07-12 13:58:45.382041] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:57.149 [... same "Failed to get dst_mbufs!" error repeated ~479 more times through 13:58:45.695983 ...]
00:32:57.149 [2024-07-12 13:58:45.696484] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.698470] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.698534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.698944] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.698998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.699276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.699301] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.699322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.700342] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.700402] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.700451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.700500] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.700774] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.702414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.702478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.703835] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.703891] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.704246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.704271] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.704293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.705255] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.705318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.705366] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.705414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.149 [2024-07-12 13:58:45.705691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.707222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.707285] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.707964] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.708017] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.708293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.708319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.708339] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.709327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.709386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.709434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.710777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.711066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.712977] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.713054] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.714970] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.715024] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.715409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.715434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.149 [2024-07-12 13:58:45.715455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.150 [2024-07-12 13:58:45.716396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.150 [2024-07-12 13:58:45.717770] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.150 [2024-07-12 13:58:45.717828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.150 [2024-07-12 13:58:45.719485] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.150 [2024-07-12 13:58:45.719778] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.150 [2024-07-12 13:58:45.719950] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.150 [2024-07-12 13:58:45.721860] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.150 [2024-07-12 13:58:45.721922] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.150 [2024-07-12 13:58:45.722003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.150 [2024-07-12 13:58:45.722500] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.150 [2024-07-12 13:58:45.722527] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.150 [2024-07-12 13:58:45.722549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.723589] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.724762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.724819] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.726113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.726395] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.728353] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.728415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.728481] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.730413] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.730824] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.730856] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.730877] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.731813] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.731877] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.731940] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.733731] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.411 [2024-07-12 13:58:45.734020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.734181] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.734906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.734968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.736738] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.737259] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.737286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.737308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.738324] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.738384] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.738432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.738480] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.738756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.738917] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.740450] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.740505] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.741863] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.742142] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.742167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.742188] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.743159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.744597] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.744656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.746026] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.411 [2024-07-12 13:58:45.746305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.746487] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.748437] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.748501] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.749778] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.750062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.750097] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.750131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.751139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.751542] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.751600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.751648] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.751934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.752099] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.754053] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.754105] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.755697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.755976] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.756002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.756023] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.758241] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.758307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.758355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.759712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.411 [2024-07-12 13:58:45.760001] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.760165] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.761967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.762023] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.763473] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.763768] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.411 [2024-07-12 13:58:45.763794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.763816] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.764774] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.764837] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.765409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.765465] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.765741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.765916] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.767878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.767947] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.769890] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.770170] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.770196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.770217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.772447] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.772512] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.773870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.773924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.412 [2024-07-12 13:58:45.774208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.774371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.776020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.776079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.776474] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.776810] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.776837] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.776863] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.781865] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.781937] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.783278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.783332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.783607] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.783766] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.785128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.785185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.786497] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.786771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.786795] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.786816] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.792183] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.792249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.793591] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.793645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.412 [2024-07-12 13:58:45.794004] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.794167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.795875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.795939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.797181] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.797460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.797484] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.797505] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.799915] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.799985] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.801671] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.801725] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.802125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.802289] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.803836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.803891] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.804468] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.804745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.804770] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.804792] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.807251] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.807319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.808742] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.808797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.412 [2024-07-12 13:58:45.809078] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.809247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.811199] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.811261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.812151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.812425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.812449] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.812483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.815100] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.815165] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.816361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.816430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.816708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.816870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.817285] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.817340] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.819268] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.819631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.819656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.819678] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.822457] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.822538] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.824245] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.824299] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.412 [2024-07-12 13:58:45.824574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.824739] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.825279] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.825343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.827172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.827450] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.827474] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.827495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.830225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.830291] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.412 [2024-07-12 13:58:45.831706] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.831760] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.832121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.832285] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.833720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.833775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.834482] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.834765] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.834790] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.834811] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.836996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.837060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.838753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.838807] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.413 [2024-07-12 13:58:45.839099] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.839265] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.840875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.840936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.842276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.842595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.842619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.842640] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.844185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.844249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.845708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.845761] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.846174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.846336] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.847844] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.847900] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.848300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.848575] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.848599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.848620] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.851387] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.851454] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.853263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.853317] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.413 [2024-07-12 13:58:45.853649] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.853814] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.853869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.855205] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.855260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.855530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.855562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.855584] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.857225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.857290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.859049] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.859113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.859579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.861628] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.861717] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.862117] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.863772] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.864113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.864138] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.864159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.865057] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.865130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.867069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.867675] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.413 [2024-07-12 13:58:45.867962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.868128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.870078] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.870151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.871855] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.872133] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.872158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.872179] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.873063] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.873532] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.874953] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.875896] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.876246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.876409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.877993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.878052] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.879987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.880274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.880298] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.880319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.882351] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.883708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.885368] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.413 [2024-07-12 13:58:45.887291] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.413 [2024-07-12 13:58:45.887569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:57.681 [... the same accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! message repeats continuously, with only the timestamps changing, from 2024-07-12 13:58:45.887569 through 13:58:46.178231 (console time 00:32:57.413 to 00:32:57.681) ...]
00:32:57.681 [2024-07-12 13:58:46.178512] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.178677] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.179098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.179154] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.179203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.179591] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.179616] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.179637] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.180627] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.182398] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.182455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.182842] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.183131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.183636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.183698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.183753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.184160] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.184526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.184552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.184574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.185577] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.185659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.185709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.186116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.681 [2024-07-12 13:58:46.186397] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.186559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.186980] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.187034] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.187428] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.187818] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.187845] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.187869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.188877] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.188959] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.189009] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.189058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.189445] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.189607] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.681 [2024-07-12 13:58:46.191184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.191240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.191632] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.192067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.192094] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.192128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.193282] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.194577] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.194633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.195416] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:57.682 [2024-07-12 13:58:46.195697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.195861] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.196273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.196341] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.196738] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.197105] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.197132] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.197156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.198171] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.198579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.198635] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.198684] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.198971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.199134] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.199537] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.199594] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.201515] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.201878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.201904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.201932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.682 [2024-07-12 13:58:46.204402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.204456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.205565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.205991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.682 [2024-07-12 13:58:46.206392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.206442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.208219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.208489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.208512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.208527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.209463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.209515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.210756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.210807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.211272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.212781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.212832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.214526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.214796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.214815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.214829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.217690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.217751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.218139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.218187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.218687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.220176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.220227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.682 [2024-07-12 13:58:46.221913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.222191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.222211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.222225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.224497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.224557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.226244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.226294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.226705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.228359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.228416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.228911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.229193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.229212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.229227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.232206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.232285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.233858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.233907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.234383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.236089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.236140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.237326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.237595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.682 [2024-07-12 13:58:46.237614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.237629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.239841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.239901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.241589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.241637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.242149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.244095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.244151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.244538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.244807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.244826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.244841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.247503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.247560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.249502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.249551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.249993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.682 [2024-07-12 13:58:46.251636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.683 [2024-07-12 13:58:46.251689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.683 [2024-07-12 13:58:46.252517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.683 [2024-07-12 13:58:46.252786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.683 [2024-07-12 13:58:46.252804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.683 [2024-07-12 13:58:46.252819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.944 [2024-07-12 13:58:46.255229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.255288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.256850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.256899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.257376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.259325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.259373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.259764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.260039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.260059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.260074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.262894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.262961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.264930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.264977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.265429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.267123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.267174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.268042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.268313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.268332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.268348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.270649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.270715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.944 [2024-07-12 13:58:46.271541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.271591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.272011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.272407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.272456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.274353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.274722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.274742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.274757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.277592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.277658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.279597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.279652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.280223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.281656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.281704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.282302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.282570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.282589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.282605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.285435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.285498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.287431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.287491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.944 [2024-07-12 13:58:46.287898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.288312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.288364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.290179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.290449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.290468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.290488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.293180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.293239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.294475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.294524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.294974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.295026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.296192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.296242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.296511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.296531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.296547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.298790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.298851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.300340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.300389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.302317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.302376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.944 [2024-07-12 13:58:46.303641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.304992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.305262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.944 [2024-07-12 13:58:46.305281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.305296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.306165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.306219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.307052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.308991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.309577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.311266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.311318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.311793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.312077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.312096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.312111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.312979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.314488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.316180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.317273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.317690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.319033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.319085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.320442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.945 [2024-07-12 13:58:46.320714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.320732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.320747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.323562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.323970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.325499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.326348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.326775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.328360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.328415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.330348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.330655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.330674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.330689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.332472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.333804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.335746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.337334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.337751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.338733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.338802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.340701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.341137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.341158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.945 [2024-07-12 13:58:46.341174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.342390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.344340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.345876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.347240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.347659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.349318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.349370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.350460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.350734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.350754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.350769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.353358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.354675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.356150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.357068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.357516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.358913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.358968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.359353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.359623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.359642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.359657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.362476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.945 [2024-07-12 13:58:46.364424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.364851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.366760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.367188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.368547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.368597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.370294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.370565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.370584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.370599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.371826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.373757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.374154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.375617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.377766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.379728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.379777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.381621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.381999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.382019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.382035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.384308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.386161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.945 [2024-07-12 13:58:46.388094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.945 [2024-07-12 13:58:46.388696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.945 accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (message repeated continuously from 13:58:46.388696 through 13:58:46.699370) 
00:32:58.213 [2024-07-12 13:58:46.699370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.213 [2024-07-12 13:58:46.699436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.699830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.699881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.700431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.702229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.702276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.702660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.703050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.703068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.703083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.704461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.704535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.706468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.706521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.707173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.707574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.707626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.708019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.708406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.708423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.708438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.710187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.710247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.710635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.213 [2024-07-12 13:58:46.710708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.711278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.711683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.711742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.712136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.712409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.712427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.712442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.713743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.713803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.714207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.714258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.714815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.716235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.716286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.717065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.717338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.717356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.717371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.718873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.718938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.719884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.719939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.720346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.213 [2024-07-12 13:58:46.720745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.720803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.721203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.721552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.721570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.721585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.723150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.723209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.723598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.723650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.724149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.724557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.724607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.725001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.725494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.725516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.725531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.726997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.727064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.727449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.213 [2024-07-12 13:58:46.727515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.728154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.728553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.728604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.214 [2024-07-12 13:58:46.729001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.729397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.729415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.729431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.731233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.731301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.731696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.731747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.732309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.732714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.732764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.733178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.733478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.733501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.733516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.735099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.735161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.735550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.735600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.736023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.736075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.736472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.736530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.736874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.214 [2024-07-12 13:58:46.736893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.736908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.738280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.738346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.738742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.738793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.739703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.739762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.740161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.740556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.740918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.740942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.740958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.742141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.742194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.742584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.742982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.743554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.743961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.744028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.744424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.744832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.744851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.744865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.214 [2024-07-12 13:58:46.746180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.746582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.746987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.747387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.747938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.748337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.748417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.748811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.749174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.749193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.749208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.750499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.750900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.751300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.751694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.752270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.752682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.752728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.753150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.753461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.753479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.753494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.755150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.755552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.214 [2024-07-12 13:58:46.755956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.756350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.756880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.757289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.757341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.757733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.214 [2024-07-12 13:58:46.758130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.758149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.758164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.759641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.760051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.760447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.760840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.761349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.762075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.762126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.763578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.764011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.764029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.764044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.765490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.765892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.767825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.768226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.215 [2024-07-12 13:58:46.768708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.770655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.770709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.771110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.771497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.771516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.771532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.773169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.775118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.776937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.777480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.778037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.779271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.779324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.779709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.780027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.780047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.780062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.781324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.781736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.782137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.782525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.784422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.215 [2024-07-12 13:58:46.786194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.215 [2024-07-12 13:58:46.786249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.788052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.788458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.788476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.788491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.791094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.792796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.795734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.795801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.797714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.798064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.798082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.799908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.801286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.803228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.804808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.805308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.806610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.806677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.807066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.807570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.807588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.810339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.812100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.477 [2024-07-12 13:58:46.813790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.814686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.815109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.816442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.816492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.817833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.818113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.818131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.819441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.819836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.821403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.821454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.821917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.823287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.823335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.824898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.825255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.825274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.826696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.826754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.827154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.827200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.829501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.829566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.477 [2024-07-12 13:58:46.830436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.831929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.832250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.832268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.833585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.833645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.834041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.834090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.834501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.835820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.837625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.837670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.837947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.837965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.839301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.839700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.840098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.840150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.841825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.841884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.843563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.843611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.843913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.843936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.477 [2024-07-12 13:58:46.846068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.847764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.849049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.849439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.850671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.850730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.852083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.852130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.852399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.852417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.854536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.854594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.856118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.856181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.857967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.858024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.859753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.859801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.860075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.860094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.477 [2024-07-12 13:58:46.861619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.861681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.863613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.863665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.478 [2024-07-12 13:58:46.865357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.865414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.867083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.867131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.867400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.867418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.870220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.871759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.873449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.873498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.874342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.874405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.874808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.874853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.875128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.875147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.877568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.879268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.879318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.880118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.881795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.881852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.883214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.478 [2024-07-12 13:58:46.883261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.478 [2024-07-12 13:58:46.883525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.480 [... the src_mbufs error above repeats, with only the microsecond timestamp changing, for every task queued between 13:58:46.883525 and 13:58:47.011109 while the mbuf pool stayed empty; the duplicate lines are condensed here ...]
00:32:58.480 [2024-07-12 13:58:47.011351] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:58.740 [... the dst_mbufs error above repeats in the same way between 13:58:47.011351 and 13:58:47.074104; the duplicate lines are condensed here ...]
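The two blocks of *ERROR* lines above come from accel_dpdk_cryptodev_task_alloc_resources() in the dpdk_cryptodev accel module: at queue depth 128 with 64 KiB verify I/O the module's mbuf pool appears to be drained faster than completing tasks return their buffers, so the per-task source and destination mbuf allocations fail. The Fail/s column in the bdevperf summary that follows is 0.00 for every job, which suggests the affected tasks were retried once mbufs were freed rather than being surfaced as I/O errors. A minimal sketch of re-running the same job by hand with a lower queue depth, to check whether the pool-exhaustion messages disappear (assumptions: the workspace layout of this run, that test/bdev/bdev.json still exists, and an arbitrary -t 5 run time since the original value is not visible in this part of the log):

cd /var/jenkins/workspace/crypto-phy-autotest/spdk
# same workload shape as the verify_big_io job summarized below (verify, 64 KiB I/O),
# but with -q 32 instead of -q 128 to keep fewer crypto tasks in flight at once
./build/examples/bdevperf --json test/bdev/bdev.json -q 32 -o 65536 -w verify -t 5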
00:32:59.308
00:32:59.308 Latency(us)
00:32:59.308 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:59.308 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:59.308 Verification LBA range: start 0x0 length 0x100
00:32:59.308 crypto_ram : 5.84 43.81 2.74 0.00 0.00 2838747.05 74768.03 2611410.14
00:32:59.308 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:59.308 Verification LBA range: start 0x100 length 0x100
00:32:59.308 crypto_ram : 6.04 39.04 2.44 0.00 0.00 3102077.26 123093.70 3165787.71
00:32:59.308 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:59.308 Verification LBA range: start 0x0 length 0x100
00:32:59.308 crypto_ram2 : 5.84 43.80 2.74 0.00 0.00 2733100.52 74312.13 2611410.14
00:32:59.308 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:59.308 Verification LBA range: start 0x100 length 0x100
00:32:59.308 crypto_ram2 : 6.06 41.72 2.61 0.00 0.00 2839596.16 39891.48 3165787.71
00:32:59.308 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:59.308 Verification LBA range: start 0x0 length 0x100
00:32:59.308 crypto_ram3 : 5.57 276.29 17.27 0.00 0.00 413167.85 27696.08 547083.13
00:32:59.308 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:59.308 Verification LBA range: start 0x100 length 0x100
00:32:59.308 crypto_ram3 : 5.73 223.32 13.96 0.00 0.00 498773.32 75679.83 612733.11
00:32:59.308 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:59.308 Verification LBA range: start 0x0 length 0x100
00:32:59.308 crypto_ram4 : 5.71 295.50 18.47 0.00 0.00 373517.36 2635.69 517905.36
00:32:59.308 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:59.308 Verification LBA range: start 0x100 length 0x100
00:32:59.308 crypto_ram4 : 5.85 240.92 15.06 0.00 0.00 449166.27 4473.54 612733.11
00:32:59.308 ===================================================================================================================
00:32:59.308 Total : 1204.40 75.27 0.00 0.00 782950.96 2635.69 3165787.71
00:32:59.567
00:32:59.567 real 0m9.330s
00:32:59.567 user 0m17.639s
00:32:59.567 sys 0m0.499s
13:58:48 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:59.567 13:58:48 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:32:59.567 ************************************
00:32:59.567 END TEST bdev_verify_big_io
00:32:59.567 ************************************
00:32:59.825 13:58:48 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:32:59.825 13:58:48 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:59.825 13:58:48 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:32:59.825 13:58:48 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:59.825 13:58:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:59.825 ************************************
00:32:59.825 START TEST bdev_write_zeroes
00:32:59.825 ************************************
00:32:59.825 13:58:48 blockdev_crypto_aesni.bdev_write_zeroes -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:59.825 [2024-07-12 13:58:48.296426] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:32:59.825 [2024-07-12 13:58:48.296491] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid623606 ] 00:33:00.084 [2024-07-12 13:58:48.424842] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:00.084 [2024-07-12 13:58:48.528756] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:00.084 [2024-07-12 13:58:48.550113] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:00.084 [2024-07-12 13:58:48.558141] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:00.084 [2024-07-12 13:58:48.566157] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:00.342 [2024-07-12 13:58:48.675872] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:02.872 [2024-07-12 13:58:50.905041] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:02.872 [2024-07-12 13:58:50.905126] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:02.872 [2024-07-12 13:58:50.905142] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:02.872 [2024-07-12 13:58:50.913059] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:02.872 [2024-07-12 13:58:50.913081] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:02.872 [2024-07-12 13:58:50.913093] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:02.872 [2024-07-12 13:58:50.921079] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:02.872 [2024-07-12 13:58:50.921098] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:02.873 [2024-07-12 13:58:50.921110] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:02.873 [2024-07-12 13:58:50.929100] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:02.873 [2024-07-12 13:58:50.929118] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:02.873 [2024-07-12 13:58:50.929130] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:02.873 Running I/O for 1 seconds... 
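The NOTICE lines above show how bdevperf assembles the crypto stack for this run: the dpdk_cryptodev module is selected as the driver for the encrypt and decrypt opcodes, each "Found key test_dek_aesni_cbc_N" is logged while a bdev_crypto_create entry from the --json config is processed, and the crypto vbdevs are only instantiated once their Malloc base bdevs register ("vbdev creation deferred pending base bdev arrival"). The bdev.json used here is generated by the test scripts and is not reproduced in the log; the snippet below is only an illustrative sketch of such a config, with method and parameter names recalled from SPDK's JSON-RPC interface rather than copied from this run, and a dummy key:

# hypothetical stand-in for the generated bdev.json (not the file used above)
cat > /tmp/crypto_bdev_example.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "accel",
      "config": [
        { "method": "accel_crypto_key_create",
          "params": { "cipher": "AES_CBC", "key": "0123456789abcdef0123456789abcdef", "name": "test_dek_aesni_cbc_1" } }
      ]
    },
    {
      "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 } },
        { "method": "bdev_crypto_create",
          "params": { "base_bdev_name": "Malloc0", "name": "crypto_ram", "key_name": "test_dek_aesni_cbc_1" } }
      ]
    }
  ]
}
EOF
# such a file can then be passed to bdevperf the same way the generated one is:
#   ./build/examples/bdevperf --json /tmp/crypto_bdev_example.json -q 128 -o 4096 -w write_zeroes -t 1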
00:33:03.808
00:33:03.808 Latency(us)
00:33:03.808 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:03.808 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:03.808 crypto_ram : 1.03 1972.13 7.70 0.00 0.00 64350.19 5442.34 77047.54
00:33:03.808 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:03.808 crypto_ram2 : 1.03 1977.86 7.73 0.00 0.00 63820.23 5385.35 71576.71
00:33:03.808 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:03.808 crypto_ram3 : 1.02 15169.85 59.26 0.00 0.00 8308.50 2464.72 10770.70
00:33:03.808 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:03.808 crypto_ram4 : 1.02 15206.90 59.40 0.00 0.00 8263.02 2450.48 8662.15
00:33:03.808 ===================================================================================================================
00:33:03.808 Total : 34326.75 134.09 0.00 0.00 14734.09 2450.48 77047.54
00:33:04.067
00:33:04.067 real 0m4.258s
00:33:04.067 user 0m3.816s
00:33:04.067 sys 0m0.404s
00:33:04.067 13:58:52 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:04.067 13:58:52 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:33:04.067 ************************************
00:33:04.067 END TEST bdev_write_zeroes
00:33:04.067 ************************************
00:33:04.068 13:58:52 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:33:04.068 13:58:52 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:04.068 13:58:52 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:33:04.068 13:58:52 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:04.068 13:58:52 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:04.068 ************************************
00:33:04.068 START TEST bdev_json_nonenclosed
00:33:04.068 ************************************
00:33:04.068 13:58:52 blockdev_crypto_aesni.bdev_json_nonenclosed -- 
00:33:04.326 [2024-07-12 13:58:52.876701] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:04.326 [2024-07-12 13:58:52.876715] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:04.584 00:33:04.584 real 0m0.401s 00:33:04.584 user 0m0.241s 00:33:04.584 sys 0m0.158s 00:33:04.584 13:58:52 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:33:04.584 13:58:52 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:04.584 13:58:52 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:04.584 ************************************ 00:33:04.584 END TEST bdev_json_nonenclosed 00:33:04.584 ************************************ 00:33:04.584 13:58:53 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:33:04.584 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true 00:33:04.584 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:04.584 13:58:53 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:04.584 13:58:53 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:04.584 13:58:53 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:04.584 ************************************ 00:33:04.584 START TEST bdev_json_nonarray 00:33:04.584 ************************************ 00:33:04.584 13:58:53 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:04.584 [2024-07-12 13:58:53.126636] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:33:04.584 [2024-07-12 13:58:53.126685] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid624238 ] 00:33:04.842 [2024-07-12 13:58:53.242052] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:04.842 [2024-07-12 13:58:53.346711] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:04.842 [2024-07-12 13:58:53.346800] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:33:04.842 [2024-07-12 13:58:53.346823] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:04.842 [2024-07-12 13:58:53.346836] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:05.100 00:33:05.101 real 0m0.384s 00:33:05.101 user 0m0.229s 00:33:05.101 sys 0m0.152s 00:33:05.101 13:58:53 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:05.101 13:58:53 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:05.101 13:58:53 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:05.101 ************************************ 00:33:05.101 END TEST bdev_json_nonarray 00:33:05.101 ************************************ 00:33:05.101 13:58:53 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:33:05.101 13:58:53 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:33:05.101 00:33:05.101 real 1m13.531s 00:33:05.101 user 2m42.225s 00:33:05.101 sys 0m9.308s 00:33:05.101 13:58:53 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:05.101 13:58:53 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:05.101 ************************************ 00:33:05.101 END TEST blockdev_crypto_aesni 00:33:05.101 ************************************ 00:33:05.101 13:58:53 -- common/autotest_common.sh@1142 -- # return 0 00:33:05.101 13:58:53 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:05.101 13:58:53 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:05.101 13:58:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:05.101 13:58:53 -- common/autotest_common.sh@10 -- # set +x 00:33:05.101 ************************************ 00:33:05.101 START TEST blockdev_crypto_sw 00:33:05.101 ************************************ 00:33:05.101 13:58:53 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:05.359 * Looking for test storage... 
00:33:05.359 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:05.359 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:33:05.359 13:58:53 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:33:05.359 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:33:05.359 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=624399 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 624399 00:33:05.360 13:58:53 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 624399 ']' 00:33:05.360 13:58:53 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:05.360 13:58:53 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:33:05.360 13:58:53 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:05.360 13:58:53 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:33:05.360 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:05.360 13:58:53 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:05.360 13:58:53 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:05.360 [2024-07-12 13:58:53.795073] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:33:05.360 [2024-07-12 13:58:53.795148] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid624399 ] 00:33:05.360 [2024-07-12 13:58:53.924547] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:05.618 [2024-07-12 13:58:54.029388] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:06.185 13:58:54 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:06.185 13:58:54 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:33:06.185 13:58:54 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:33:06.185 13:58:54 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:33:06.185 13:58:54 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:33:06.185 13:58:54 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:06.185 13:58:54 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:06.444 Malloc0 00:33:06.444 Malloc1 00:33:06.444 true 00:33:06.444 true 00:33:06.444 true 00:33:06.444 [2024-07-12 13:58:55.000797] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:06.444 crypto_ram 00:33:06.444 [2024-07-12 13:58:55.008824] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:06.444 crypto_ram2 00:33:06.444 [2024-07-12 13:58:55.016845] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:06.444 crypto_ram3 00:33:06.702 [ 00:33:06.702 { 00:33:06.702 "name": "Malloc1", 00:33:06.702 "aliases": [ 00:33:06.702 "12022974-c8d9-46ad-be2f-668efaf57e32" 00:33:06.702 ], 00:33:06.702 "product_name": "Malloc disk", 00:33:06.702 "block_size": 4096, 00:33:06.702 "num_blocks": 4096, 00:33:06.702 "uuid": "12022974-c8d9-46ad-be2f-668efaf57e32", 00:33:06.702 "assigned_rate_limits": { 00:33:06.702 "rw_ios_per_sec": 0, 00:33:06.702 "rw_mbytes_per_sec": 0, 00:33:06.702 "r_mbytes_per_sec": 0, 00:33:06.702 "w_mbytes_per_sec": 0 00:33:06.702 }, 00:33:06.702 "claimed": true, 00:33:06.702 "claim_type": "exclusive_write", 00:33:06.702 "zoned": false, 00:33:06.702 "supported_io_types": { 00:33:06.702 "read": true, 00:33:06.702 "write": true, 00:33:06.702 "unmap": true, 00:33:06.702 "flush": true, 00:33:06.702 "reset": true, 00:33:06.702 "nvme_admin": false, 00:33:06.702 "nvme_io": false, 00:33:06.702 "nvme_io_md": false, 00:33:06.702 "write_zeroes": true, 00:33:06.702 "zcopy": true, 00:33:06.702 "get_zone_info": false, 00:33:06.702 "zone_management": false, 00:33:06.702 "zone_append": false, 00:33:06.702 "compare": false, 00:33:06.702 "compare_and_write": false, 00:33:06.702 "abort": true, 00:33:06.702 "seek_hole": false, 00:33:06.702 "seek_data": false, 00:33:06.702 "copy": true, 00:33:06.702 "nvme_iov_md": false 00:33:06.702 }, 00:33:06.702 "memory_domains": [ 00:33:06.702 { 00:33:06.702 "dma_device_id": "system", 00:33:06.702 "dma_device_type": 1 00:33:06.702 }, 00:33:06.702 { 
00:33:06.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:06.702 "dma_device_type": 2 00:33:06.702 } 00:33:06.702 ], 00:33:06.702 "driver_specific": {} 00:33:06.702 } 00:33:06.702 ] 00:33:06.702 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:06.702 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:33:06.702 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:06.702 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:06.702 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:06.702 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:33:06.702 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:33:06.702 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:06.702 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:06.702 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:06.702 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:33:06.702 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:06.702 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:06.702 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:06.702 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:33:06.702 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:06.703 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:06.703 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:06.703 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:33:06.703 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:33:06.703 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:33:06.703 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:06.703 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:06.703 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:06.703 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:33:06.703 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:33:06.703 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4eac4bb1-1788-5665-b470-34cf74efc045"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "4eac4bb1-1788-5665-b470-34cf74efc045",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6a09f2ea-3a15-53a5-9f69-d4dbf932ba37"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "6a09f2ea-3a15-53a5-9f69-d4dbf932ba37",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:06.703 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:33:06.703 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:33:06.703 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:33:06.703 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 624399 00:33:06.703 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 624399 ']' 00:33:06.703 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 624399 00:33:06.703 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:33:06.961 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:06.961 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 624399 00:33:06.961 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:06.961 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:06.961 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 624399' 00:33:06.961 killing process with pid 624399 00:33:06.961 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 624399 00:33:06.961 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 624399 00:33:07.220 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:07.220 13:58:55 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:07.220 13:58:55 
blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:33:07.220 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:07.220 13:58:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:07.220 ************************************ 00:33:07.220 START TEST bdev_hello_world 00:33:07.220 ************************************ 00:33:07.220 13:58:55 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:07.478 [2024-07-12 13:58:55.826799] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:33:07.478 [2024-07-12 13:58:55.826867] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid624614 ] 00:33:07.478 [2024-07-12 13:58:55.954282] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:07.478 [2024-07-12 13:58:56.056365] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:07.736 [2024-07-12 13:58:56.242303] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:07.736 [2024-07-12 13:58:56.242380] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:07.736 [2024-07-12 13:58:56.242395] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:07.736 [2024-07-12 13:58:56.250323] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:07.736 [2024-07-12 13:58:56.250344] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:07.736 [2024-07-12 13:58:56.250356] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:07.736 [2024-07-12 13:58:56.258344] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:07.736 [2024-07-12 13:58:56.258362] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:07.736 [2024-07-12 13:58:56.258375] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:07.736 [2024-07-12 13:58:56.299896] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:07.736 [2024-07-12 13:58:56.299942] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:07.736 [2024-07-12 13:58:56.299961] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:33:07.736 [2024-07-12 13:58:56.301702] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:07.736 [2024-07-12 13:58:56.301770] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:07.736 [2024-07-12 13:58:56.301786] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:07.736 [2024-07-12 13:58:56.301820] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
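(The 'Found key "test_dek_sw*"' notices above come from rpc_bdev_crypto_create while the generated bdev.json is loaded: each software crypto vbdev wraps a base bdev and is bound to a named accel crypto key, which is also what the earlier bdev_get_bdevs dump reports under driver_specific.crypto. Below is a hedged sketch of the corresponding bdev-subsystem entries; the actual bdev.json written by blockdev.sh is not shown here, the parameter names simply mirror the driver_specific fields and may differ from this SPDK revision's exact RPC schema, and the accel key itself must already be defined in the accel subsystem config, which is omitted because its parameters do not appear in the log.)

```bash
#!/usr/bin/env bash
# Illustrative only -- not the literal bdev.json generated by blockdev.sh.
cat > /tmp/crypto_sw_bdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 32768, "block_size": 512 } },
        { "method": "bdev_crypto_create",
          "params": { "base_bdev_name": "Malloc0",
                      "name": "crypto_ram",
                      "key_name": "test_dek_sw" } }
      ]
    }
  ]
}
EOF
# Assuming the accel key "test_dek_sw" is already present (its accel-subsystem
# definition is not visible in this log), hello_bdev opens the vbdev by name
# exactly as in the run above:
./build/examples/hello_bdev --json /tmp/crypto_sw_bdev.json -b crypto_ram
```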
00:33:07.736 00:33:07.736 [2024-07-12 13:58:56.301838] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:07.995 00:33:07.995 real 0m0.765s 00:33:07.995 user 0m0.491s 00:33:07.995 sys 0m0.252s 00:33:07.995 13:58:56 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:07.995 13:58:56 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:07.995 ************************************ 00:33:07.995 END TEST bdev_hello_world 00:33:07.995 ************************************ 00:33:07.995 13:58:56 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:07.995 13:58:56 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:33:07.995 13:58:56 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:07.995 13:58:56 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:08.254 13:58:56 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:08.254 ************************************ 00:33:08.254 START TEST bdev_bounds 00:33:08.254 ************************************ 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=624793 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 624793' 00:33:08.254 Process bdevio pid: 624793 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 624793 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 624793 ']' 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:08.254 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:08.254 13:58:56 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:08.254 [2024-07-12 13:58:56.679427] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:33:08.254 [2024-07-12 13:58:56.679498] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid624793 ] 00:33:08.254 [2024-07-12 13:58:56.814640] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:08.512 [2024-07-12 13:58:56.924640] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:08.512 [2024-07-12 13:58:56.924682] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:08.512 [2024-07-12 13:58:56.924681] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:08.770 [2024-07-12 13:58:57.103192] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:08.770 [2024-07-12 13:58:57.103259] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:08.770 [2024-07-12 13:58:57.103275] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:08.770 [2024-07-12 13:58:57.111211] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:08.770 [2024-07-12 13:58:57.111231] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:08.770 [2024-07-12 13:58:57.111243] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:08.770 [2024-07-12 13:58:57.119233] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:08.770 [2024-07-12 13:58:57.119251] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:08.770 [2024-07-12 13:58:57.119263] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:09.335 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:09.335 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:09.335 13:58:57 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:09.335 I/O targets: 00:33:09.335 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:33:09.335 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:33:09.335 00:33:09.335 00:33:09.335 CUnit - A unit testing framework for C - Version 2.1-3 00:33:09.335 http://cunit.sourceforge.net/ 00:33:09.335 00:33:09.335 00:33:09.335 Suite: bdevio tests on: crypto_ram3 00:33:09.335 Test: blockdev write read block ...passed 00:33:09.335 Test: blockdev write zeroes read block ...passed 00:33:09.335 Test: blockdev write zeroes read no split ...passed 00:33:09.335 Test: blockdev write zeroes read split ...passed 00:33:09.335 Test: blockdev write zeroes read split partial ...passed 00:33:09.335 Test: blockdev reset ...passed 00:33:09.335 Test: blockdev write read 8 blocks ...passed 00:33:09.335 Test: blockdev write read size > 128k ...passed 00:33:09.335 Test: blockdev write read invalid size ...passed 00:33:09.335 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:09.335 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:09.335 Test: blockdev write read max offset ...passed 00:33:09.335 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:09.335 Test: blockdev writev readv 8 blocks 
...passed 00:33:09.335 Test: blockdev writev readv 30 x 1block ...passed 00:33:09.335 Test: blockdev writev readv block ...passed 00:33:09.335 Test: blockdev writev readv size > 128k ...passed 00:33:09.335 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:09.335 Test: blockdev comparev and writev ...passed 00:33:09.335 Test: blockdev nvme passthru rw ...passed 00:33:09.335 Test: blockdev nvme passthru vendor specific ...passed 00:33:09.335 Test: blockdev nvme admin passthru ...passed 00:33:09.335 Test: blockdev copy ...passed 00:33:09.335 Suite: bdevio tests on: crypto_ram 00:33:09.335 Test: blockdev write read block ...passed 00:33:09.335 Test: blockdev write zeroes read block ...passed 00:33:09.335 Test: blockdev write zeroes read no split ...passed 00:33:09.335 Test: blockdev write zeroes read split ...passed 00:33:09.335 Test: blockdev write zeroes read split partial ...passed 00:33:09.335 Test: blockdev reset ...passed 00:33:09.335 Test: blockdev write read 8 blocks ...passed 00:33:09.335 Test: blockdev write read size > 128k ...passed 00:33:09.335 Test: blockdev write read invalid size ...passed 00:33:09.335 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:09.335 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:09.335 Test: blockdev write read max offset ...passed 00:33:09.335 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:09.335 Test: blockdev writev readv 8 blocks ...passed 00:33:09.335 Test: blockdev writev readv 30 x 1block ...passed 00:33:09.335 Test: blockdev writev readv block ...passed 00:33:09.335 Test: blockdev writev readv size > 128k ...passed 00:33:09.335 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:09.335 Test: blockdev comparev and writev ...passed 00:33:09.335 Test: blockdev nvme passthru rw ...passed 00:33:09.335 Test: blockdev nvme passthru vendor specific ...passed 00:33:09.335 Test: blockdev nvme admin passthru ...passed 00:33:09.335 Test: blockdev copy ...passed 00:33:09.335 00:33:09.335 Run Summary: Type Total Ran Passed Failed Inactive 00:33:09.335 suites 2 2 n/a 0 0 00:33:09.335 tests 46 46 46 0 0 00:33:09.335 asserts 260 260 260 0 n/a 00:33:09.335 00:33:09.335 Elapsed time = 0.200 seconds 00:33:09.335 0 00:33:09.335 13:58:57 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 624793 00:33:09.335 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 624793 ']' 00:33:09.335 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 624793 00:33:09.335 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:33:09.335 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:09.335 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 624793 00:33:09.335 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:09.336 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:09.336 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 624793' 00:33:09.336 killing process with pid 624793 00:33:09.336 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 624793 00:33:09.336 13:58:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # 
wait 624793 00:33:09.593 13:58:58 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:33:09.593 00:33:09.593 real 0m1.526s 00:33:09.593 user 0m3.889s 00:33:09.593 sys 0m0.395s 00:33:09.593 13:58:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:09.593 13:58:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:09.593 ************************************ 00:33:09.593 END TEST bdev_bounds 00:33:09.593 ************************************ 00:33:09.852 13:58:58 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:09.852 13:58:58 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:09.852 13:58:58 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:09.852 13:58:58 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:09.852 13:58:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:09.852 ************************************ 00:33:09.853 START TEST bdev_nbd 00:33:09.853 ************************************ 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=625004 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 625004 /var/tmp/spdk-nbd.sock 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 625004 ']' 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:33:09.853 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:09.853 13:58:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:09.853 [2024-07-12 13:58:58.294304] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:33:09.853 [2024-07-12 13:58:58.294360] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:09.853 [2024-07-12 13:58:58.407744] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:10.112 [2024-07-12 13:58:58.510316] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:10.112 [2024-07-12 13:58:58.689906] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:10.112 [2024-07-12 13:58:58.689994] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:10.112 [2024-07-12 13:58:58.690011] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:10.370 [2024-07-12 13:58:58.697922] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:10.370 [2024-07-12 13:58:58.697948] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:10.370 [2024-07-12 13:58:58.697960] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:10.370 [2024-07-12 13:58:58.705949] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:10.370 [2024-07-12 13:58:58.705968] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:10.370 [2024-07-12 13:58:58.705980] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:10.936 13:58:59 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:10.936 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:11.194 1+0 records in 00:33:11.194 1+0 records out 00:33:11.194 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256213 s, 16.0 MB/s 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:11.194 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:11.451 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:33:11.451 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:11.451 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:11.451 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:11.451 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:11.451 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:11.451 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:11.451 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:11.451 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:11.451 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:11.451 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:11.452 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:11.452 1+0 records in 00:33:11.452 1+0 records out 00:33:11.452 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000359199 s, 11.4 MB/s 00:33:11.452 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:11.452 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:11.452 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:11.452 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:11.452 13:58:59 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:11.452 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:11.452 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:11.452 13:58:59 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:11.708 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:11.708 { 00:33:11.708 "nbd_device": "/dev/nbd0", 00:33:11.708 "bdev_name": "crypto_ram" 00:33:11.708 }, 00:33:11.708 { 00:33:11.708 "nbd_device": "/dev/nbd1", 00:33:11.708 "bdev_name": "crypto_ram3" 00:33:11.708 } 00:33:11.708 ]' 00:33:11.708 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:11.708 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:11.708 { 00:33:11.708 "nbd_device": "/dev/nbd0", 00:33:11.708 "bdev_name": "crypto_ram" 00:33:11.708 }, 00:33:11.708 { 00:33:11.708 "nbd_device": "/dev/nbd1", 00:33:11.708 "bdev_name": "crypto_ram3" 00:33:11.708 } 00:33:11.708 ]' 00:33:11.708 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:11.708 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 
/dev/nbd1' 00:33:11.709 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:11.709 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:11.709 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:11.709 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:11.709 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:11.709 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:11.966 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:11.966 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:11.966 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:11.966 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:11.966 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:11.966 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:11.966 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:11.966 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:11.966 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:11.966 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:12.224 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:12.224 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:12.224 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:12.224 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:12.224 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:12.224 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:12.224 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:12.224 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:12.224 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:12.224 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:12.224 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- 
# grep -c /dev/nbd 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:12.482 13:59:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:12.741 /dev/nbd0 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:12.741 1+0 records in 00:33:12.741 1+0 records out 00:33:12.741 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024932 s, 16.4 MB/s 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:12.741 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:33:13.059 /dev/nbd1 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:13.059 1+0 records in 00:33:13.059 1+0 records out 00:33:13.059 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000323316 s, 12.7 MB/s 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:13.059 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:13.346 { 00:33:13.346 "nbd_device": "/dev/nbd0", 00:33:13.346 "bdev_name": "crypto_ram" 00:33:13.346 }, 00:33:13.346 { 00:33:13.346 "nbd_device": "/dev/nbd1", 00:33:13.346 "bdev_name": "crypto_ram3" 00:33:13.346 } 00:33:13.346 ]' 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:13.346 { 00:33:13.346 "nbd_device": "/dev/nbd0", 00:33:13.346 "bdev_name": "crypto_ram" 00:33:13.346 }, 00:33:13.346 { 00:33:13.346 "nbd_device": "/dev/nbd1", 00:33:13.346 "bdev_name": "crypto_ram3" 00:33:13.346 } 00:33:13.346 ]' 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:13.346 /dev/nbd1' 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:13.346 /dev/nbd1' 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:13.346 256+0 records in 00:33:13.346 256+0 records out 00:33:13.346 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113982 s, 92.0 MB/s 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:13.346 256+0 records in 00:33:13.346 256+0 records out 00:33:13.346 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0244089 s, 43.0 MB/s 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:13.346 256+0 records in 00:33:13.346 256+0 records out 00:33:13.346 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0460066 s, 22.8 MB/s 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:13.346 13:59:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:13.639 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:13.639 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:13.639 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:13.639 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:13.639 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:13.639 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:13.639 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:13.639 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:13.639 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:13.639 13:59:02 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:13.897 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:13.897 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:13.897 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:13.897 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:13.897 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:13.897 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:13.897 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:13.897 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:13.897 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:13.897 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:13.897 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:14.156 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:14.414 malloc_lvol_verify 00:33:14.414 13:59:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:14.672 
0c468380-8156-4f75-b761-aa47c4865457 00:33:14.672 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:14.930 dd779f0f-ea51-43e0-962c-56dcf113ef31 00:33:14.930 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:15.189 /dev/nbd0 00:33:15.189 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:15.189 mke2fs 1.46.5 (30-Dec-2021) 00:33:15.189 Discarding device blocks: 0/4096 done 00:33:15.189 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:15.189 00:33:15.189 Allocating group tables: 0/1 done 00:33:15.189 Writing inode tables: 0/1 done 00:33:15.189 Creating journal (1024 blocks): done 00:33:15.189 Writing superblocks and filesystem accounting information: 0/1 done 00:33:15.189 00:33:15.189 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:15.189 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:33:15.189 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:15.189 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:15.189 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:15.189 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:15.189 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:15.189 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 625004 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 625004 ']' 00:33:15.448 13:59:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 625004 00:33:15.448 13:59:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:33:15.448 13:59:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:15.448 13:59:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 625004 
00:33:15.707 13:59:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:15.707 13:59:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:15.707 13:59:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 625004' 00:33:15.707 killing process with pid 625004 00:33:15.707 13:59:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 625004 00:33:15.707 13:59:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 625004 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:33:15.967 00:33:15.967 real 0m6.058s 00:33:15.967 user 0m8.608s 00:33:15.967 sys 0m2.451s 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:15.967 ************************************ 00:33:15.967 END TEST bdev_nbd 00:33:15.967 ************************************ 00:33:15.967 13:59:04 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:15.967 13:59:04 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:33:15.967 13:59:04 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:33:15.967 13:59:04 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:33:15.967 13:59:04 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:33:15.967 13:59:04 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:15.967 13:59:04 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:15.967 13:59:04 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:15.967 ************************************ 00:33:15.967 START TEST bdev_fio 00:33:15.967 ************************************ 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:15.967 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local 
env_context= 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:15.967 ************************************ 00:33:15.967 START TEST bdev_fio_rw_verify 00:33:15.967 ************************************ 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:15.967 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:15.968 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:15.968 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:15.968 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:15.968 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:15.968 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:15.968 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:15.968 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:15.968 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:15.968 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:15.968 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:15.968 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:16.237 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:16.237 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:16.237 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:16.237 13:59:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:16.497 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:16.497 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:16.497 fio-3.35 00:33:16.497 Starting 2 threads 00:33:28.709 00:33:28.709 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=626129: Fri Jul 12 13:59:15 2024 00:33:28.709 read: IOPS=21.7k, BW=84.6MiB/s (88.8MB/s)(846MiB/10000msec) 00:33:28.709 slat (usec): min=14, max=487, avg=20.10, stdev= 3.71 00:33:28.709 clat (usec): min=7, max=775, avg=146.80, stdev=59.01 00:33:28.709 lat (usec): min=27, max=796, avg=166.90, stdev=60.47 00:33:28.709 clat percentiles (usec): 00:33:28.709 | 50.000th=[ 143], 99.000th=[ 281], 99.900th=[ 302], 99.990th=[ 347], 00:33:28.709 | 99.999th=[ 725] 00:33:28.709 write: IOPS=26.0k, BW=102MiB/s (107MB/s)(963MiB/9479msec); 0 zone resets 00:33:28.709 slat (usec): min=14, max=414, avg=34.08, stdev= 4.56 00:33:28.709 clat (usec): min=26, max=956, avg=197.59, stdev=90.70 00:33:28.709 lat (usec): min=55, max=1109, avg=231.67, stdev=92.37 00:33:28.709 clat percentiles (usec): 00:33:28.709 | 50.000th=[ 192], 99.000th=[ 392], 99.900th=[ 416], 99.990th=[ 644], 00:33:28.709 | 99.999th=[ 889] 00:33:28.709 bw ( KiB/s): min=92704, max=105160, per=94.83%, avg=98670.74, stdev=1624.63, samples=38 00:33:28.709 iops : min=23176, max=26290, avg=24667.68, stdev=406.16, samples=38 00:33:28.709 lat (usec) : 10=0.01%, 20=0.01%, 50=4.56%, 100=14.78%, 250=62.55% 00:33:28.709 lat (usec) : 500=18.08%, 750=0.02%, 1000=0.01% 00:33:28.709 cpu : usr=99.56%, sys=0.01%, ctx=42, majf=0, minf=380 00:33:28.709 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:28.709 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:28.709 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:28.709 issued rwts: total=216687,246566,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:28.709 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:28.709 00:33:28.709 Run status group 0 (all jobs): 00:33:28.709 READ: bw=84.6MiB/s (88.8MB/s), 84.6MiB/s-84.6MiB/s (88.8MB/s-88.8MB/s), io=846MiB (888MB), run=10000-10000msec 00:33:28.709 WRITE: bw=102MiB/s (107MB/s), 102MiB/s-102MiB/s (107MB/s-107MB/s), io=963MiB (1010MB), run=9479-9479msec 00:33:28.709 00:33:28.709 real 0m11.270s 00:33:28.709 user 0m24.122s 00:33:28.709 sys 0m0.365s 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:28.709 ************************************ 00:33:28.709 END TEST bdev_fio_rw_verify 00:33:28.709 ************************************ 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4eac4bb1-1788-5665-b470-34cf74efc045"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "4eac4bb1-1788-5665-b470-34cf74efc045",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6a09f2ea-3a15-53a5-9f69-d4dbf932ba37"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "6a09f2ea-3a15-53a5-9f69-d4dbf932ba37",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:28.709 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:33:28.709 crypto_ram3 ]] 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4eac4bb1-1788-5665-b470-34cf74efc045"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "4eac4bb1-1788-5665-b470-34cf74efc045",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6a09f2ea-3a15-53a5-9f69-d4dbf932ba37"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "6a09f2ea-3a15-53a5-9f69-d4dbf932ba37",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' 
{' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:28.710 ************************************ 00:33:28.710 START TEST bdev_fio_trim 00:33:28.710 ************************************ 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:28.710 13:59:15 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:28.710 13:59:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:28.710 13:59:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:28.710 13:59:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:28.710 13:59:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:28.710 13:59:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:28.710 13:59:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:28.710 13:59:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:28.710 13:59:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:28.710 13:59:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:28.710 13:59:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:28.710 13:59:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.710 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:28.710 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:28.710 fio-3.35 00:33:28.710 Starting 2 threads 00:33:38.696 00:33:38.696 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=627690: Fri Jul 12 13:59:26 2024 00:33:38.696 write: IOPS=17.7k, BW=69.2MiB/s (72.6MB/s)(693MiB/10001msec); 0 zone resets 00:33:38.696 slat (usec): min=32, max=979, avg=49.65, stdev=10.00 00:33:38.696 clat (usec): min=74, max=3684, avg=371.28, stdev=204.22 00:33:38.696 lat (usec): min=124, max=3724, avg=420.93, stdev=211.64 00:33:38.696 clat percentiles (usec): 00:33:38.696 | 50.000th=[ 297], 99.000th=[ 758], 99.900th=[ 816], 99.990th=[ 1057], 00:33:38.696 | 99.999th=[ 3621] 00:33:38.696 bw ( KiB/s): min=69560, max=74816, per=100.00%, avg=71008.42, stdev=832.27, samples=38 00:33:38.696 iops : min=17390, max=18704, avg=17752.11, stdev=208.07, samples=38 00:33:38.696 trim: IOPS=17.7k, BW=69.2MiB/s (72.6MB/s)(693MiB/10001msec); 0 zone resets 00:33:38.696 
slat (usec): min=13, max=184, avg=21.99, stdev= 5.00 00:33:38.696 clat (usec): min=51, max=3724, avg=248.16, stdev=74.76 00:33:38.696 lat (usec): min=73, max=3742, avg=270.15, stdev=74.92 00:33:38.696 clat percentiles (usec): 00:33:38.696 | 50.000th=[ 247], 99.000th=[ 404], 99.900th=[ 433], 99.990th=[ 515], 00:33:38.696 | 99.999th=[ 3654] 00:33:38.696 bw ( KiB/s): min=69560, max=74816, per=100.00%, avg=71009.68, stdev=831.37, samples=38 00:33:38.696 iops : min=17390, max=18704, avg=17752.42, stdev=207.84, samples=38 00:33:38.696 lat (usec) : 100=0.24%, 250=44.26%, 500=37.23%, 750=17.60%, 1000=0.66% 00:33:38.696 lat (msec) : 2=0.01%, 4=0.01% 00:33:38.696 cpu : usr=99.30%, sys=0.01%, ctx=32, majf=0, minf=271 00:33:38.696 IO depths : 1=7.5%, 2=17.5%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:38.696 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:38.696 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:38.696 issued rwts: total=0,177292,177294,0 short=0,0,0,0 dropped=0,0,0,0 00:33:38.696 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:38.696 00:33:38.696 Run status group 0 (all jobs): 00:33:38.696 WRITE: bw=69.2MiB/s (72.6MB/s), 69.2MiB/s-69.2MiB/s (72.6MB/s-72.6MB/s), io=693MiB (726MB), run=10001-10001msec 00:33:38.696 TRIM: bw=69.2MiB/s (72.6MB/s), 69.2MiB/s-69.2MiB/s (72.6MB/s-72.6MB/s), io=693MiB (726MB), run=10001-10001msec 00:33:38.696 00:33:38.696 real 0m11.291s 00:33:38.696 user 0m23.900s 00:33:38.696 sys 0m0.363s 00:33:38.696 13:59:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:38.696 13:59:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:38.696 ************************************ 00:33:38.696 END TEST bdev_fio_trim 00:33:38.696 ************************************ 00:33:38.956 13:59:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:38.956 13:59:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:33:38.956 13:59:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:38.956 13:59:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:33:38.956 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:38.956 13:59:27 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:33:38.956 00:33:38.956 real 0m22.929s 00:33:38.956 user 0m48.216s 00:33:38.956 sys 0m0.923s 00:33:38.956 13:59:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:38.956 13:59:27 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:38.956 ************************************ 00:33:38.956 END TEST bdev_fio 00:33:38.956 ************************************ 00:33:38.956 13:59:27 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:38.956 13:59:27 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:38.956 13:59:27 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:38.956 13:59:27 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:38.957 13:59:27 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:33:38.957 13:59:27 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:38.957 ************************************ 00:33:38.957 START TEST bdev_verify 00:33:38.957 ************************************ 00:33:38.957 13:59:27 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:38.957 [2024-07-12 13:59:27.456994] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:33:38.957 [2024-07-12 13:59:27.457064] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid629048 ] 00:33:39.216 [2024-07-12 13:59:27.577384] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:39.216 [2024-07-12 13:59:27.687954] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:39.216 [2024-07-12 13:59:27.687960] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:39.475 [2024-07-12 13:59:27.869115] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:39.475 [2024-07-12 13:59:27.869190] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:39.475 [2024-07-12 13:59:27.869206] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:39.475 [2024-07-12 13:59:27.877131] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:39.475 [2024-07-12 13:59:27.877150] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:39.475 [2024-07-12 13:59:27.877162] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:39.475 [2024-07-12 13:59:27.885155] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:39.475 [2024-07-12 13:59:27.885173] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:39.475 [2024-07-12 13:59:27.885184] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:39.475 Running I/O for 5 seconds... 
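The bdev_fio stage that finished just above assembles its fio job file from a series of echoes and then launches fio through the spdk_bdev plugin. Condensed into one sketch for the read/write verify pass, using only fragments visible in the trace (the [global] template that fio_config_gen writes first is not echoed here, so it is omitted), the stage amounts to roughly the following; paths and flags are copied from the trace:
# Append the per-bdev job sections that the trace echoes into test/bdev/bdev.fio
# (the [global] template written earlier by fio_config_gen is not shown in the log).
cat >> /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio <<'EOF'
serialize_overlap=1
[job_crypto_ram]
filename=crypto_ram
[job_crypto_ram3]
filename=crypto_ram3
EOF
# Run fio with the SPDK bdev ioengine preloaded, as the LD_PRELOAD line in the trace shows.
LD_PRELOAD=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev \
/usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
  /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio \
  --verify_state_save=0 \
  --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
  --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output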
00:33:44.746 00:33:44.746 Latency(us) 00:33:44.746 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:44.746 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:44.746 Verification LBA range: start 0x0 length 0x800 00:33:44.746 crypto_ram : 5.01 5980.70 23.36 0.00 0.00 21319.60 1837.86 23592.96 00:33:44.746 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:44.746 Verification LBA range: start 0x800 length 0x800 00:33:44.746 crypto_ram : 5.01 4828.17 18.86 0.00 0.00 26400.57 2051.56 26898.25 00:33:44.746 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:44.746 Verification LBA range: start 0x0 length 0x800 00:33:44.746 crypto_ram3 : 5.02 3006.67 11.74 0.00 0.00 42343.73 2094.30 28493.91 00:33:44.746 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:44.746 Verification LBA range: start 0x800 length 0x800 00:33:44.746 crypto_ram3 : 5.03 2440.79 9.53 0.00 0.00 52116.41 2293.76 33508.84 00:33:44.746 =================================================================================================================== 00:33:44.746 Total : 16256.32 63.50 0.00 0.00 31363.00 1837.86 33508.84 00:33:44.746 00:33:44.746 real 0m5.830s 00:33:44.746 user 0m10.944s 00:33:44.746 sys 0m0.242s 00:33:44.746 13:59:33 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:44.746 13:59:33 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:33:44.746 ************************************ 00:33:44.746 END TEST bdev_verify 00:33:44.746 ************************************ 00:33:44.746 13:59:33 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:44.746 13:59:33 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:44.746 13:59:33 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:44.746 13:59:33 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:44.746 13:59:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:44.746 ************************************ 00:33:44.746 START TEST bdev_verify_big_io 00:33:44.746 ************************************ 00:33:44.746 13:59:33 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:45.004 [2024-07-12 13:59:33.375346] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:33:45.004 [2024-07-12 13:59:33.375407] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid629766 ] 00:33:45.004 [2024-07-12 13:59:33.504472] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:45.261 [2024-07-12 13:59:33.603626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:45.261 [2024-07-12 13:59:33.603632] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:45.261 [2024-07-12 13:59:33.768900] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:45.261 [2024-07-12 13:59:33.768973] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:45.261 [2024-07-12 13:59:33.768990] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.261 [2024-07-12 13:59:33.776929] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:45.261 [2024-07-12 13:59:33.776948] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:45.262 [2024-07-12 13:59:33.776960] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.262 [2024-07-12 13:59:33.784969] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:45.262 [2024-07-12 13:59:33.784988] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:45.262 [2024-07-12 13:59:33.784999] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.262 Running I/O for 5 seconds... 
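As a side note on the nbd_dd_data_verify pattern traced earlier in this section (the bdev_nbd run around the 00:33:13 marks): the write/verify round-trip is simply the same random 1 MiB pushed through dd onto every NBD device and compared back with cmp. A minimal standalone sketch, with nbd_list as an illustrative stand-in for the device list returned by nbd_get_disks:
# Write the same random 1 MiB to every NBD device with O_DIRECT, then compare it back.
tmp_file=$(mktemp)
nbd_list=(/dev/nbd0 /dev/nbd1)
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
for dev in "${nbd_list[@]}"; do
    dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
done
for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$dev"   # any non-zero exit here means the data did not survive the round-trip
done
rm -f "$tmp_file"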
00:33:51.820 00:33:51.820 Latency(us) 00:33:51.820 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:51.820 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:51.820 Verification LBA range: start 0x0 length 0x80 00:33:51.820 crypto_ram : 5.03 432.51 27.03 0.00 0.00 288317.25 6810.05 413959.57 00:33:51.820 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:51.820 Verification LBA range: start 0x80 length 0x80 00:33:51.820 crypto_ram : 5.04 355.58 22.22 0.00 0.00 349223.08 7864.32 466844.27 00:33:51.820 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:51.820 Verification LBA range: start 0x0 length 0x80 00:33:51.820 crypto_ram3 : 5.27 242.91 15.18 0.00 0.00 496017.74 6382.64 428548.45 00:33:51.820 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:51.820 Verification LBA range: start 0x80 length 0x80 00:33:51.820 crypto_ram3 : 5.28 193.82 12.11 0.00 0.00 612552.61 7265.95 492374.82 00:33:51.820 =================================================================================================================== 00:33:51.820 Total : 1224.83 76.55 0.00 0.00 401043.16 6382.64 492374.82 00:33:51.820 00:33:51.820 real 0m6.064s 00:33:51.820 user 0m11.432s 00:33:51.820 sys 0m0.237s 00:33:51.820 13:59:39 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:51.820 13:59:39 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:33:51.820 ************************************ 00:33:51.820 END TEST bdev_verify_big_io 00:33:51.820 ************************************ 00:33:51.820 13:59:39 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:51.820 13:59:39 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:51.820 13:59:39 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:51.821 13:59:39 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:51.821 13:59:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:51.821 ************************************ 00:33:51.821 START TEST bdev_write_zeroes 00:33:51.821 ************************************ 00:33:51.821 13:59:39 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:51.821 [2024-07-12 13:59:39.561905] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:33:51.821 [2024-07-12 13:59:39.562046] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid630643 ] 00:33:51.821 [2024-07-12 13:59:39.760971] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:51.821 [2024-07-12 13:59:39.864670] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:51.821 [2024-07-12 13:59:40.039637] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:51.821 [2024-07-12 13:59:40.039703] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:51.821 [2024-07-12 13:59:40.039719] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:51.821 [2024-07-12 13:59:40.047655] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:51.821 [2024-07-12 13:59:40.047674] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:51.821 [2024-07-12 13:59:40.047686] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:51.821 [2024-07-12 13:59:40.055676] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:51.821 [2024-07-12 13:59:40.055694] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:51.821 [2024-07-12 13:59:40.055705] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:51.821 Running I/O for 1 seconds... 00:33:52.758 00:33:52.758 Latency(us) 00:33:52.758 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:52.758 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:52.758 crypto_ram : 1.01 26478.48 103.43 0.00 0.00 4822.33 2094.30 6610.59 00:33:52.758 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:52.758 crypto_ram3 : 1.01 13268.85 51.83 0.00 0.00 9574.79 3348.03 9915.88 00:33:52.758 =================================================================================================================== 00:33:52.758 Total : 39747.32 155.26 0.00 0.00 6411.53 2094.30 9915.88 00:33:53.017 00:33:53.017 real 0m1.900s 00:33:53.017 user 0m1.587s 00:33:53.017 sys 0m0.288s 00:33:53.017 13:59:41 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:53.017 13:59:41 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:33:53.017 ************************************ 00:33:53.017 END TEST bdev_write_zeroes 00:33:53.017 ************************************ 00:33:53.017 13:59:41 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:53.017 13:59:41 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:53.017 13:59:41 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:53.017 13:59:41 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:53.017 13:59:41 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:53.017 
************************************ 00:33:53.017 START TEST bdev_json_nonenclosed 00:33:53.017 ************************************ 00:33:53.017 13:59:41 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:53.017 [2024-07-12 13:59:41.504901] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:33:53.017 [2024-07-12 13:59:41.504968] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid630847 ] 00:33:53.276 [2024-07-12 13:59:41.634151] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:53.277 [2024-07-12 13:59:41.730254] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:53.277 [2024-07-12 13:59:41.730325] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:33:53.277 [2024-07-12 13:59:41.730347] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:53.277 [2024-07-12 13:59:41.730360] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:53.277 00:33:53.277 real 0m0.390s 00:33:53.277 user 0m0.236s 00:33:53.277 sys 0m0.151s 00:33:53.277 13:59:41 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:33:53.277 13:59:41 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:53.277 13:59:41 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:53.277 ************************************ 00:33:53.277 END TEST bdev_json_nonenclosed 00:33:53.277 ************************************ 00:33:53.536 13:59:41 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:33:53.536 13:59:41 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:33:53.536 13:59:41 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:53.536 13:59:41 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:53.536 13:59:41 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:53.536 13:59:41 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:53.536 ************************************ 00:33:53.536 START TEST bdev_json_nonarray 00:33:53.536 ************************************ 00:33:53.536 13:59:41 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:53.536 [2024-07-12 13:59:41.970513] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:33:53.536 [2024-07-12 13:59:41.970576] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid630915 ] 00:33:53.536 [2024-07-12 13:59:42.100224] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:53.795 [2024-07-12 13:59:42.198208] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:53.795 [2024-07-12 13:59:42.198283] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:33:53.795 [2024-07-12 13:59:42.198304] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:53.795 [2024-07-12 13:59:42.198316] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:53.795 00:33:53.795 real 0m0.391s 00:33:53.795 user 0m0.231s 00:33:53.795 sys 0m0.157s 00:33:53.795 13:59:42 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:53.795 13:59:42 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:53.795 13:59:42 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:53.795 ************************************ 00:33:53.795 END TEST bdev_json_nonarray 00:33:53.795 ************************************ 00:33:53.795 13:59:42 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:33:53.795 13:59:42 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:33:53.795 13:59:42 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:33:53.796 13:59:42 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:33:53.796 13:59:42 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:33:53.796 13:59:42 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:33:53.796 13:59:42 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:33:53.796 13:59:42 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:53.796 13:59:42 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:54.056 ************************************ 00:33:54.056 START TEST bdev_crypto_enomem 00:33:54.056 ************************************ 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=631056 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # 
waitforlisten 631056 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 631056 ']' 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:54.056 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:54.056 13:59:42 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:54.056 [2024-07-12 13:59:42.450034] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:33:54.056 [2024-07-12 13:59:42.450102] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid631056 ] 00:33:54.056 [2024-07-12 13:59:42.584892] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:54.316 [2024-07-12 13:59:42.704149] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:54.882 true 00:33:54.882 base0 00:33:54.882 true 00:33:54.882 [2024-07-12 13:59:43.410672] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:54.882 crypt0 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:54.882 13:59:43 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:54.882 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:54.882 [ 00:33:54.882 { 00:33:54.882 "name": "crypt0", 00:33:54.882 "aliases": [ 00:33:54.882 "61dc25a3-1d13-548e-bc17-52b7fad9b4f2" 00:33:54.882 ], 00:33:54.882 "product_name": "crypto", 00:33:54.882 "block_size": 512, 00:33:54.882 "num_blocks": 2097152, 00:33:54.882 "uuid": "61dc25a3-1d13-548e-bc17-52b7fad9b4f2", 00:33:54.882 "assigned_rate_limits": { 00:33:54.882 "rw_ios_per_sec": 0, 00:33:54.882 "rw_mbytes_per_sec": 0, 00:33:54.882 "r_mbytes_per_sec": 0, 00:33:54.882 "w_mbytes_per_sec": 0 00:33:54.882 }, 00:33:54.882 "claimed": false, 00:33:54.882 "zoned": false, 00:33:54.882 "supported_io_types": { 00:33:54.882 "read": true, 00:33:54.882 "write": true, 00:33:54.882 "unmap": false, 00:33:54.882 "flush": false, 00:33:54.882 "reset": true, 00:33:54.882 "nvme_admin": false, 00:33:54.882 "nvme_io": false, 00:33:54.882 "nvme_io_md": false, 00:33:54.882 "write_zeroes": true, 00:33:54.882 "zcopy": false, 00:33:54.882 "get_zone_info": false, 00:33:54.882 "zone_management": false, 00:33:54.882 "zone_append": false, 00:33:54.882 "compare": false, 00:33:54.882 "compare_and_write": false, 00:33:54.882 "abort": false, 00:33:54.883 "seek_hole": false, 00:33:54.883 "seek_data": false, 00:33:54.883 "copy": false, 00:33:54.883 "nvme_iov_md": false 00:33:54.883 }, 00:33:54.883 "memory_domains": [ 00:33:54.883 { 00:33:54.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:54.883 "dma_device_type": 2 00:33:54.883 } 00:33:54.883 ], 00:33:54.883 "driver_specific": { 00:33:54.883 "crypto": { 00:33:54.883 "base_bdev_name": "EE_base0", 00:33:54.883 "name": "crypt0", 00:33:54.883 "key_name": "test_dek_sw" 00:33:54.883 } 00:33:54.883 } 00:33:54.883 } 00:33:54.883 ] 00:33:54.883 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:54.883 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:33:54.883 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=631182 00:33:54.883 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:33:54.883 13:59:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:55.140 Running I/O for 5 seconds... 
00:33:56.075 13:59:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:33:56.075 13:59:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:56.075 13:59:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:56.075 13:59:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:56.075 13:59:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 631182 00:34:00.325 00:34:00.325 Latency(us) 00:34:00.325 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:00.325 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:34:00.325 crypt0 : 5.00 28296.27 110.53 0.00 0.00 1126.46 541.38 1481.68 00:34:00.325 =================================================================================================================== 00:34:00.325 Total : 28296.27 110.53 0.00 0.00 1126.46 541.38 1481.68 00:34:00.325 0 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 631056 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 631056 ']' 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 631056 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 631056 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 631056' 00:34:00.325 killing process with pid 631056 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 631056 00:34:00.325 Received shutdown signal, test time was about 5.000000 seconds 00:34:00.325 00:34:00.325 Latency(us) 00:34:00.325 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:00.325 =================================================================================================================== 00:34:00.325 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:00.325 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 631056 00:34:00.584 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:34:00.584 00:34:00.584 real 0m6.531s 00:34:00.584 user 0m6.745s 00:34:00.584 sys 0m0.423s 00:34:00.584 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:34:00.584 13:59:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:00.584 ************************************ 00:34:00.584 END TEST bdev_crypto_enomem 00:34:00.584 ************************************ 00:34:00.584 13:59:48 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:00.584 13:59:48 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:34:00.584 13:59:48 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:34:00.584 13:59:48 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:34:00.584 13:59:48 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:00.584 13:59:48 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:34:00.584 13:59:48 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:34:00.584 13:59:48 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:34:00.584 13:59:48 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:34:00.584 00:34:00.584 real 0m55.369s 00:34:00.584 user 1m34.937s 00:34:00.584 sys 0m6.778s 00:34:00.584 13:59:48 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:00.584 13:59:48 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:00.584 ************************************ 00:34:00.584 END TEST blockdev_crypto_sw 00:34:00.584 ************************************ 00:34:00.584 13:59:49 -- common/autotest_common.sh@1142 -- # return 0 00:34:00.584 13:59:49 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:00.584 13:59:49 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:00.584 13:59:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:00.584 13:59:49 -- common/autotest_common.sh@10 -- # set +x 00:34:00.584 ************************************ 00:34:00.584 START TEST blockdev_crypto_qat 00:34:00.584 ************************************ 00:34:00.584 13:59:49 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:00.584 * Looking for test storage... 
00:34:00.584 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=631987 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:34:00.843 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 631987 00:34:00.844 13:59:49 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 631987 ']' 00:34:00.844 13:59:49 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:00.844 13:59:49 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:00.844 13:59:49 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:34:00.844 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:00.844 13:59:49 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:00.844 13:59:49 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:00.844 [2024-07-12 13:59:49.252753] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:34:00.844 [2024-07-12 13:59:49.252829] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid631987 ] 00:34:00.844 [2024-07-12 13:59:49.382791] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:01.102 [2024-07-12 13:59:49.491891] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:01.361 13:59:49 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:01.361 13:59:49 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:34:01.361 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:34:01.361 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:34:01.361 13:59:49 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:34:01.361 13:59:49 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:01.361 13:59:49 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:01.361 [2024-07-12 13:59:49.714844] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:01.361 [2024-07-12 13:59:49.722878] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:01.361 [2024-07-12 13:59:49.730896] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:01.361 [2024-07-12 13:59:49.806771] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:03.896 true 00:34:03.896 true 00:34:03.896 true 00:34:03.896 true 00:34:03.896 Malloc0 00:34:03.896 Malloc1 00:34:03.896 Malloc2 00:34:03.896 Malloc3 00:34:03.896 [2024-07-12 13:59:52.177469] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:03.896 crypto_ram 00:34:03.896 [2024-07-12 13:59:52.185488] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:03.896 crypto_ram1 00:34:03.896 [2024-07-12 13:59:52.193512] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:03.896 crypto_ram2 00:34:03.896 [2024-07-12 13:59:52.201534] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:03.896 crypto_ram3 00:34:03.896 [ 00:34:03.896 { 00:34:03.896 "name": "Malloc1", 00:34:03.896 "aliases": [ 00:34:03.896 "474074ef-2dcc-46c0-afd8-a85c235a27aa" 00:34:03.896 ], 00:34:03.896 "product_name": "Malloc disk", 00:34:03.896 "block_size": 512, 00:34:03.896 "num_blocks": 65536, 00:34:03.896 "uuid": "474074ef-2dcc-46c0-afd8-a85c235a27aa", 00:34:03.896 "assigned_rate_limits": { 00:34:03.896 "rw_ios_per_sec": 0, 00:34:03.896 "rw_mbytes_per_sec": 0, 00:34:03.896 "r_mbytes_per_sec": 0, 00:34:03.896 "w_mbytes_per_sec": 0 00:34:03.896 }, 00:34:03.896 "claimed": true, 00:34:03.896 "claim_type": "exclusive_write", 00:34:03.896 "zoned": false, 00:34:03.896 "supported_io_types": { 
00:34:03.896 "read": true, 00:34:03.896 "write": true, 00:34:03.896 "unmap": true, 00:34:03.896 "flush": true, 00:34:03.896 "reset": true, 00:34:03.896 "nvme_admin": false, 00:34:03.896 "nvme_io": false, 00:34:03.896 "nvme_io_md": false, 00:34:03.896 "write_zeroes": true, 00:34:03.896 "zcopy": true, 00:34:03.896 "get_zone_info": false, 00:34:03.896 "zone_management": false, 00:34:03.896 "zone_append": false, 00:34:03.896 "compare": false, 00:34:03.896 "compare_and_write": false, 00:34:03.896 "abort": true, 00:34:03.896 "seek_hole": false, 00:34:03.896 "seek_data": false, 00:34:03.896 "copy": true, 00:34:03.896 "nvme_iov_md": false 00:34:03.896 }, 00:34:03.896 "memory_domains": [ 00:34:03.896 { 00:34:03.896 "dma_device_id": "system", 00:34:03.896 "dma_device_type": 1 00:34:03.896 }, 00:34:03.896 { 00:34:03.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:03.896 "dma_device_type": 2 00:34:03.896 } 00:34:03.896 ], 00:34:03.896 "driver_specific": {} 00:34:03.896 } 00:34:03.896 ] 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.896 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.896 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:34:03.896 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.896 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.896 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.896 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:34:03.896 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:34:03.896 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:03.896 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.896 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:34:03.896 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:34:03.897 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' 
"aliases": [' ' "5bb97275-952c-5ea5-8d3f-c4c095fed50d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5bb97275-952c-5ea5-8d3f-c4c095fed50d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "10fb1a13-b85c-5126-ab0c-ac7b3b025e7e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "10fb1a13-b85c-5126-ab0c-ac7b3b025e7e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4c6de1c6-651d-570c-8119-3192e45c9dcb"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4c6de1c6-651d-570c-8119-3192e45c9dcb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ea21ef88-4237-52bb-811c-a13194eda7ab"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ea21ef88-4237-52bb-811c-a13194eda7ab",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:04.156 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:34:04.156 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:34:04.156 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:34:04.156 13:59:52 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 631987 00:34:04.156 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 631987 ']' 00:34:04.156 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 631987 00:34:04.156 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:34:04.156 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:04.156 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 631987 00:34:04.156 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:04.156 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:04.156 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 631987' 00:34:04.156 killing process with pid 631987 00:34:04.157 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 631987 00:34:04.157 13:59:52 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 631987 00:34:04.724 13:59:53 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:04.724 13:59:53 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:04.724 13:59:53 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:34:04.724 13:59:53 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:04.724 13:59:53 blockdev_crypto_qat -- 
common/autotest_common.sh@10 -- # set +x 00:34:04.724 ************************************ 00:34:04.724 START TEST bdev_hello_world 00:34:04.724 ************************************ 00:34:04.724 13:59:53 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:04.724 [2024-07-12 13:59:53.197830] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:34:04.724 [2024-07-12 13:59:53.197899] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid632452 ] 00:34:04.983 [2024-07-12 13:59:53.331224] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:04.983 [2024-07-12 13:59:53.435869] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:04.983 [2024-07-12 13:59:53.457222] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:04.983 [2024-07-12 13:59:53.465251] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:04.983 [2024-07-12 13:59:53.473277] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:05.242 [2024-07-12 13:59:53.588950] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:07.775 [2024-07-12 13:59:55.805648] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:07.775 [2024-07-12 13:59:55.805724] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:07.775 [2024-07-12 13:59:55.805740] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.775 [2024-07-12 13:59:55.813664] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:07.775 [2024-07-12 13:59:55.813685] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:07.775 [2024-07-12 13:59:55.813698] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.775 [2024-07-12 13:59:55.821685] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:07.775 [2024-07-12 13:59:55.821704] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:07.775 [2024-07-12 13:59:55.821721] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.775 [2024-07-12 13:59:55.829704] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:07.775 [2024-07-12 13:59:55.829723] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:07.775 [2024-07-12 13:59:55.829735] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.775 [2024-07-12 13:59:55.907178] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:34:07.775 [2024-07-12 13:59:55.907225] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:34:07.775 [2024-07-12 13:59:55.907245] hello_bdev.c: 244:hello_start: 
*NOTICE*: Opening io channel 00:34:07.775 [2024-07-12 13:59:55.908527] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:34:07.775 [2024-07-12 13:59:55.908608] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:34:07.775 [2024-07-12 13:59:55.908625] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:34:07.775 [2024-07-12 13:59:55.908670] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:34:07.775 00:34:07.775 [2024-07-12 13:59:55.908690] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:34:07.775 00:34:07.775 real 0m3.165s 00:34:07.775 user 0m2.736s 00:34:07.775 sys 0m0.388s 00:34:07.775 13:59:56 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:07.775 13:59:56 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:34:07.775 ************************************ 00:34:07.775 END TEST bdev_hello_world 00:34:07.775 ************************************ 00:34:07.775 13:59:56 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:07.775 13:59:56 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:34:07.775 13:59:56 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:07.775 13:59:56 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:07.775 13:59:56 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:08.034 ************************************ 00:34:08.034 START TEST bdev_bounds 00:34:08.034 ************************************ 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=632912 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 632912' 00:34:08.034 Process bdevio pid: 632912 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 632912 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 632912 ']' 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:08.034 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:08.034 13:59:56 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:08.034 [2024-07-12 13:59:56.452393] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:34:08.034 [2024-07-12 13:59:56.452474] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid632912 ] 00:34:08.035 [2024-07-12 13:59:56.579424] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:08.294 [2024-07-12 13:59:56.687049] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:08.294 [2024-07-12 13:59:56.690964] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:08.294 [2024-07-12 13:59:56.690966] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:08.294 [2024-07-12 13:59:56.712368] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:08.294 [2024-07-12 13:59:56.720391] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:08.294 [2024-07-12 13:59:56.728410] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:08.294 [2024-07-12 13:59:56.825766] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:10.831 [2024-07-12 13:59:59.022582] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:10.831 [2024-07-12 13:59:59.022659] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:10.831 [2024-07-12 13:59:59.022674] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:10.831 [2024-07-12 13:59:59.030598] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:10.831 [2024-07-12 13:59:59.030618] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:10.831 [2024-07-12 13:59:59.030630] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:10.831 [2024-07-12 13:59:59.038622] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:10.831 [2024-07-12 13:59:59.038641] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:10.831 [2024-07-12 13:59:59.038653] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:10.831 [2024-07-12 13:59:59.046642] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:10.831 [2024-07-12 13:59:59.046660] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:10.831 [2024-07-12 13:59:59.046671] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:10.831 13:59:59 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:10.831 13:59:59 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:34:10.831 13:59:59 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:10.831 I/O targets: 00:34:10.831 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:34:10.831 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:34:10.831 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:34:10.831 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:34:10.831 
00:34:10.831 00:34:10.831 CUnit - A unit testing framework for C - Version 2.1-3 00:34:10.831 http://cunit.sourceforge.net/ 00:34:10.831 00:34:10.831 00:34:10.831 Suite: bdevio tests on: crypto_ram3 00:34:10.831 Test: blockdev write read block ...passed 00:34:10.831 Test: blockdev write zeroes read block ...passed 00:34:10.831 Test: blockdev write zeroes read no split ...passed 00:34:10.831 Test: blockdev write zeroes read split ...passed 00:34:10.831 Test: blockdev write zeroes read split partial ...passed 00:34:10.831 Test: blockdev reset ...passed 00:34:10.831 Test: blockdev write read 8 blocks ...passed 00:34:10.831 Test: blockdev write read size > 128k ...passed 00:34:10.831 Test: blockdev write read invalid size ...passed 00:34:10.831 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:10.831 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:10.831 Test: blockdev write read max offset ...passed 00:34:10.831 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:10.831 Test: blockdev writev readv 8 blocks ...passed 00:34:10.831 Test: blockdev writev readv 30 x 1block ...passed 00:34:10.831 Test: blockdev writev readv block ...passed 00:34:10.831 Test: blockdev writev readv size > 128k ...passed 00:34:10.831 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:10.831 Test: blockdev comparev and writev ...passed 00:34:10.831 Test: blockdev nvme passthru rw ...passed 00:34:10.831 Test: blockdev nvme passthru vendor specific ...passed 00:34:10.831 Test: blockdev nvme admin passthru ...passed 00:34:10.831 Test: blockdev copy ...passed 00:34:10.831 Suite: bdevio tests on: crypto_ram2 00:34:10.831 Test: blockdev write read block ...passed 00:34:10.831 Test: blockdev write zeroes read block ...passed 00:34:10.831 Test: blockdev write zeroes read no split ...passed 00:34:10.831 Test: blockdev write zeroes read split ...passed 00:34:10.831 Test: blockdev write zeroes read split partial ...passed 00:34:10.831 Test: blockdev reset ...passed 00:34:10.831 Test: blockdev write read 8 blocks ...passed 00:34:10.831 Test: blockdev write read size > 128k ...passed 00:34:10.831 Test: blockdev write read invalid size ...passed 00:34:10.831 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:10.831 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:10.832 Test: blockdev write read max offset ...passed 00:34:10.832 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:10.832 Test: blockdev writev readv 8 blocks ...passed 00:34:10.832 Test: blockdev writev readv 30 x 1block ...passed 00:34:10.832 Test: blockdev writev readv block ...passed 00:34:10.832 Test: blockdev writev readv size > 128k ...passed 00:34:10.832 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:10.832 Test: blockdev comparev and writev ...passed 00:34:10.832 Test: blockdev nvme passthru rw ...passed 00:34:10.832 Test: blockdev nvme passthru vendor specific ...passed 00:34:10.832 Test: blockdev nvme admin passthru ...passed 00:34:10.832 Test: blockdev copy ...passed 00:34:10.832 Suite: bdevio tests on: crypto_ram1 00:34:10.832 Test: blockdev write read block ...passed 00:34:10.832 Test: blockdev write zeroes read block ...passed 00:34:10.832 Test: blockdev write zeroes read no split ...passed 00:34:11.090 Test: blockdev write zeroes read split ...passed 00:34:11.350 Test: blockdev write zeroes read split partial ...passed 00:34:11.350 Test: blockdev reset 
...passed 00:34:11.350 Test: blockdev write read 8 blocks ...passed 00:34:11.350 Test: blockdev write read size > 128k ...passed 00:34:11.350 Test: blockdev write read invalid size ...passed 00:34:11.350 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:11.350 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:11.350 Test: blockdev write read max offset ...passed 00:34:11.350 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:11.350 Test: blockdev writev readv 8 blocks ...passed 00:34:11.350 Test: blockdev writev readv 30 x 1block ...passed 00:34:11.350 Test: blockdev writev readv block ...passed 00:34:11.350 Test: blockdev writev readv size > 128k ...passed 00:34:11.350 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:11.350 Test: blockdev comparev and writev ...passed 00:34:11.350 Test: blockdev nvme passthru rw ...passed 00:34:11.350 Test: blockdev nvme passthru vendor specific ...passed 00:34:11.350 Test: blockdev nvme admin passthru ...passed 00:34:11.350 Test: blockdev copy ...passed 00:34:11.350 Suite: bdevio tests on: crypto_ram 00:34:11.350 Test: blockdev write read block ...passed 00:34:11.350 Test: blockdev write zeroes read block ...passed 00:34:11.350 Test: blockdev write zeroes read no split ...passed 00:34:11.350 Test: blockdev write zeroes read split ...passed 00:34:11.609 Test: blockdev write zeroes read split partial ...passed 00:34:11.609 Test: blockdev reset ...passed 00:34:11.609 Test: blockdev write read 8 blocks ...passed 00:34:11.609 Test: blockdev write read size > 128k ...passed 00:34:11.609 Test: blockdev write read invalid size ...passed 00:34:11.609 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:11.609 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:11.609 Test: blockdev write read max offset ...passed 00:34:11.609 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:11.609 Test: blockdev writev readv 8 blocks ...passed 00:34:11.609 Test: blockdev writev readv 30 x 1block ...passed 00:34:11.609 Test: blockdev writev readv block ...passed 00:34:11.609 Test: blockdev writev readv size > 128k ...passed 00:34:11.609 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:11.609 Test: blockdev comparev and writev ...passed 00:34:11.609 Test: blockdev nvme passthru rw ...passed 00:34:11.609 Test: blockdev nvme passthru vendor specific ...passed 00:34:11.609 Test: blockdev nvme admin passthru ...passed 00:34:11.609 Test: blockdev copy ...passed 00:34:11.609 00:34:11.609 Run Summary: Type Total Ran Passed Failed Inactive 00:34:11.609 suites 4 4 n/a 0 0 00:34:11.609 tests 92 92 92 0 0 00:34:11.609 asserts 520 520 520 0 n/a 00:34:11.609 00:34:11.609 Elapsed time = 1.572 seconds 00:34:11.609 0 00:34:11.609 14:00:00 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 632912 00:34:11.609 14:00:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 632912 ']' 00:34:11.609 14:00:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 632912 00:34:11.609 14:00:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:34:11.609 14:00:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:11.609 14:00:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 632912 00:34:11.609 14:00:00 
blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:11.609 14:00:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:11.609 14:00:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 632912' 00:34:11.609 killing process with pid 632912 00:34:11.609 14:00:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 632912 00:34:11.609 14:00:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 632912 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:34:12.178 00:34:12.178 real 0m4.102s 00:34:12.178 user 0m11.093s 00:34:12.178 sys 0m0.548s 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:12.178 ************************************ 00:34:12.178 END TEST bdev_bounds 00:34:12.178 ************************************ 00:34:12.178 14:00:00 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:12.178 14:00:00 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:12.178 14:00:00 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:34:12.178 14:00:00 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:12.178 14:00:00 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:12.178 ************************************ 00:34:12.178 START TEST bdev_nbd 00:34:12.178 ************************************ 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:34:12.178 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=633502 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 633502 /var/tmp/spdk-nbd.sock 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 633502 ']' 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:34:12.179 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:12.179 14:00:00 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:12.179 [2024-07-12 14:00:00.642505] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:34:12.179 [2024-07-12 14:00:00.642570] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:12.438 [2024-07-12 14:00:00.771988] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:12.438 [2024-07-12 14:00:00.873636] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:12.438 [2024-07-12 14:00:00.894921] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:12.438 [2024-07-12 14:00:00.902947] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:12.438 [2024-07-12 14:00:00.910964] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:12.438 [2024-07-12 14:00:01.017108] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:14.972 [2024-07-12 14:00:03.219279] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:14.972 [2024-07-12 14:00:03.219338] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:14.972 [2024-07-12 14:00:03.219353] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.972 [2024-07-12 14:00:03.227298] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:14.973 [2024-07-12 14:00:03.227318] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:14.973 [2024-07-12 14:00:03.227330] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.973 [2024-07-12 14:00:03.235319] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:14.973 [2024-07-12 14:00:03.235337] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:14.973 [2024-07-12 14:00:03.235348] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.973 [2024-07-12 14:00:03.243340] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:14.973 [2024-07-12 14:00:03.243357] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:14.973 [2024-07-12 14:00:03.243368] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram1 crypto_ram2 crypto_ram3' 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:14.973 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:15.231 1+0 records in 00:34:15.231 1+0 records out 00:34:15.231 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000324171 s, 12.6 MB/s 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:15.231 14:00:03 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 
00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:15.797 1+0 records in 00:34:15.797 1+0 records out 00:34:15.797 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000322075 s, 12.7 MB/s 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:15.797 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:15.798 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:15.798 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:15.798 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:16.056 14:00:04 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:16.056 1+0 records in 00:34:16.056 1+0 records out 00:34:16.056 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282814 s, 14.5 MB/s 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:16.056 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:16.315 1+0 records in 00:34:16.315 1+0 records out 00:34:16.315 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000354428 s, 11.6 MB/s 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:16.315 14:00:04 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:16.315 14:00:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:16.573 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:34:16.573 { 00:34:16.573 "nbd_device": "/dev/nbd0", 00:34:16.573 "bdev_name": "crypto_ram" 00:34:16.573 }, 00:34:16.573 { 00:34:16.573 "nbd_device": "/dev/nbd1", 00:34:16.573 "bdev_name": "crypto_ram1" 00:34:16.573 }, 00:34:16.573 { 00:34:16.573 "nbd_device": "/dev/nbd2", 00:34:16.573 "bdev_name": "crypto_ram2" 00:34:16.573 }, 00:34:16.573 { 00:34:16.573 "nbd_device": "/dev/nbd3", 00:34:16.573 "bdev_name": "crypto_ram3" 00:34:16.573 } 00:34:16.573 ]' 00:34:16.573 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:34:16.573 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:34:16.573 { 00:34:16.573 "nbd_device": "/dev/nbd0", 00:34:16.573 "bdev_name": "crypto_ram" 00:34:16.573 }, 00:34:16.573 { 00:34:16.573 "nbd_device": "/dev/nbd1", 00:34:16.573 "bdev_name": "crypto_ram1" 00:34:16.573 }, 00:34:16.573 { 00:34:16.573 "nbd_device": "/dev/nbd2", 00:34:16.573 "bdev_name": "crypto_ram2" 00:34:16.573 }, 00:34:16.573 { 00:34:16.573 "nbd_device": "/dev/nbd3", 00:34:16.573 "bdev_name": "crypto_ram3" 00:34:16.573 } 00:34:16.573 ]' 00:34:16.573 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:34:16.573 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:34:16.573 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:16.573 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:34:16.573 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:16.573 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:16.573 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:16.573 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:16.831 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:16.831 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:16.831 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:16.831 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:16.831 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:16.831 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:16.831 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:16.831 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:16.831 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:16.831 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:17.090 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:17.090 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:17.090 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:17.090 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:17.090 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:17.090 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:17.090 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:17.090 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:17.090 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:17.090 14:00:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:34:17.658 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:34:17.658 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:34:17.658 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:34:17.658 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:17.658 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:17.658 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:34:17.658 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:17.658 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:17.658 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:17.658 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:34:17.915 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:34:17.915 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:34:17.915 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:34:17.915 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:17.915 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:17.915 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:34:17.915 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:17.915 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:17.915 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:17.915 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:17.915 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:18.173 14:00:06 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:18.173 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:18.173 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:18.173 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:18.173 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:18.173 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:18.173 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:18.173 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:18.173 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:18.174 14:00:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:34:18.739 /dev/nbd0 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:18.739 14:00:07 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:18.739 1+0 records in 00:34:18.739 1+0 records out 00:34:18.739 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000355201 s, 11.5 MB/s 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:18.739 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:34:18.997 /dev/nbd1 00:34:18.997 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:19.258 1+0 records in 00:34:19.258 1+0 records out 00:34:19.258 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317015 s, 12.9 MB/s 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:19.258 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:34:19.517 /dev/nbd10 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:19.517 1+0 records in 00:34:19.517 1+0 records out 00:34:19.517 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333528 s, 12.3 MB/s 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:19.517 14:00:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:34:19.775 /dev/nbd11 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd11 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:19.775 1+0 records in 00:34:19.775 1+0 records out 00:34:19.775 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279636 s, 14.6 MB/s 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:19.775 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:19.776 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:34:20.034 { 00:34:20.034 "nbd_device": "/dev/nbd0", 00:34:20.034 "bdev_name": "crypto_ram" 00:34:20.034 }, 00:34:20.034 { 00:34:20.034 "nbd_device": "/dev/nbd1", 00:34:20.034 "bdev_name": "crypto_ram1" 00:34:20.034 }, 00:34:20.034 { 00:34:20.034 "nbd_device": "/dev/nbd10", 00:34:20.034 "bdev_name": "crypto_ram2" 00:34:20.034 }, 00:34:20.034 { 00:34:20.034 "nbd_device": "/dev/nbd11", 00:34:20.034 "bdev_name": "crypto_ram3" 00:34:20.034 } 00:34:20.034 ]' 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:34:20.034 { 00:34:20.034 "nbd_device": "/dev/nbd0", 00:34:20.034 "bdev_name": "crypto_ram" 00:34:20.034 }, 00:34:20.034 { 00:34:20.034 "nbd_device": "/dev/nbd1", 00:34:20.034 "bdev_name": "crypto_ram1" 00:34:20.034 }, 00:34:20.034 { 00:34:20.034 "nbd_device": "/dev/nbd10", 00:34:20.034 "bdev_name": "crypto_ram2" 00:34:20.034 }, 00:34:20.034 { 00:34:20.034 "nbd_device": "/dev/nbd11", 00:34:20.034 "bdev_name": "crypto_ram3" 00:34:20.034 } 00:34:20.034 ]' 00:34:20.034 
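The nbd_get_disks JSON echoed above is what the test parses to confirm how many bdevs are currently exported. A minimal sketch of that bookkeeping, using the same rpc.py socket and the same jq/grep filters shown in the trace (variable names here are illustrative):

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-nbd.sock

    # nbd_get_disks returns a JSON array of {"nbd_device": ..., "bdev_name": ...} pairs
    nbd_disks_json=$("$rpc_py" -s "$sock" nbd_get_disks)

    # keep only the device nodes, then count them; grep -c exits non-zero when the
    # list is empty, hence the "|| true" guard for the expected count=0 case after
    # all disks have been stopped
    nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
    count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)
    echo "exported NBD devices: $count"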
14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:34:20.034 /dev/nbd1 00:34:20.034 /dev/nbd10 00:34:20.034 /dev/nbd11' 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:34:20.034 /dev/nbd1 00:34:20.034 /dev/nbd10 00:34:20.034 /dev/nbd11' 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:34:20.034 256+0 records in 00:34:20.034 256+0 records out 00:34:20.034 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010222 s, 103 MB/s 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:34:20.034 256+0 records in 00:34:20.034 256+0 records out 00:34:20.034 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0827061 s, 12.7 MB/s 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:20.034 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:34:20.293 256+0 records in 00:34:20.293 256+0 records out 00:34:20.293 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0658186 s, 15.9 MB/s 00:34:20.293 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:20.293 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:34:20.293 256+0 records in 00:34:20.293 256+0 records out 00:34:20.293 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0585374 s, 17.9 MB/s 00:34:20.293 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:20.293 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:34:20.293 256+0 records in 00:34:20.293 256+0 records out 00:34:20.293 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0559777 s, 18.7 MB/s 00:34:20.293 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:34:20.293 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:20.293 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:20.293 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:34:20.293 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:20.294 14:00:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:20.860 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:20.860 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:20.860 14:00:09 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:20.860 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:20.860 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:20.860 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:20.860 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:20.860 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:20.860 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:20.860 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:21.119 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:21.119 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:21.119 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:21.119 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:21.119 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:21.119 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:21.119 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:21.119 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:21.119 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:21.119 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:34:21.390 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:34:21.390 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:34:21.390 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:34:21.390 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:21.390 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:21.390 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:34:21.390 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:21.390 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:21.390 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:21.390 14:00:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:34:21.649 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:34:21.649 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:34:21.649 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:34:21.649 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:21.649 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:21.649 14:00:10 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:34:21.649 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:21.649 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:21.649 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:21.649 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:21.649 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:21.908 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:21.908 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:21.908 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:34:22.165 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:34:22.424 malloc_lvol_verify 00:34:22.424 14:00:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:34:22.683 64c039b4-5ff9-4b30-a84f-cae4a5087619 00:34:22.683 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:34:22.941 a2f8a610-8c56-4ce4-bd8e-cab5808ca4cf 00:34:22.941 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:34:23.199 /dev/nbd0 00:34:23.199 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 
00:34:23.199 mke2fs 1.46.5 (30-Dec-2021) 00:34:23.199 Discarding device blocks: 0/4096 done 00:34:23.199 Creating filesystem with 4096 1k blocks and 1024 inodes 00:34:23.199 00:34:23.199 Allocating group tables: 0/1 done 00:34:23.199 Writing inode tables: 0/1 done 00:34:23.199 Creating journal (1024 blocks): done 00:34:23.199 Writing superblocks and filesystem accounting information: 0/1 done 00:34:23.199 00:34:23.199 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:34:23.199 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:34:23.199 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:23.199 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:23.199 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:23.199 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:23.199 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:23.199 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 633502 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 633502 ']' 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 633502 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 633502 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 633502' 00:34:23.457 killing process with pid 633502 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 633502 00:34:23.457 14:00:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 633502 
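The data pass that just completed boils down to: write the same 1 MiB of random data through every exported NBD device with O_DIRECT, then read it back and byte-compare it, which is what verifies that the crypto vbdevs round-trip data correctly through encrypt and decrypt. A simplified sketch follows; the traced run writes to all four devices first and compares in a second loop, and uses spdk/test/bdev/nbdrandtest as the scratch file, so the interleaved loop and mktemp here are simplifications.

    pattern=$(mktemp)
    dd if=/dev/urandom of="$pattern" bs=4096 count=256   # 1 MiB of random data

    for dev in /dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11; do
        # push the pattern through the crypto vbdev (encrypted on write) ...
        dd if="$pattern" of="$dev" bs=4096 count=256 oflag=direct
        # ... and verify the first 1 MiB reads back identically (decrypted on read)
        cmp -b -n 1M "$pattern" "$dev"
    done

    rm -f "$pattern"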
00:34:24.024 14:00:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:34:24.024 00:34:24.024 real 0m11.892s 00:34:24.024 user 0m15.810s 00:34:24.024 sys 0m4.639s 00:34:24.024 14:00:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:24.024 14:00:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:24.025 ************************************ 00:34:24.025 END TEST bdev_nbd 00:34:24.025 ************************************ 00:34:24.025 14:00:12 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:24.025 14:00:12 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:34:24.025 14:00:12 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:34:24.025 14:00:12 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:34:24.025 14:00:12 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:34:24.025 14:00:12 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:24.025 14:00:12 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:24.025 14:00:12 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:24.025 ************************************ 00:34:24.025 START TEST bdev_fio 00:34:24.025 ************************************ 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:24.025 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:34:24.025 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:24.283 14:00:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:24.283 ************************************ 00:34:24.283 START TEST bdev_fio_rw_verify 00:34:24.283 ************************************ 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:24.284 14:00:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:24.542 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:24.542 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:24.542 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:24.542 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:24.542 fio-3.35 00:34:24.542 Starting 4 threads 00:34:39.473 00:34:39.473 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=636176: Fri Jul 12 14:00:25 2024 00:34:39.473 read: IOPS=20.2k, BW=79.0MiB/s (82.8MB/s)(790MiB/10001msec) 00:34:39.473 slat (usec): min=17, max=416, avg=67.74, stdev=30.96 00:34:39.473 clat (usec): min=16, max=1703, avg=376.29, stdev=210.95 00:34:39.473 lat (usec): min=34, max=1864, avg=444.03, stdev=223.06 00:34:39.473 clat percentiles (usec): 00:34:39.473 | 50.000th=[ 330], 99.000th=[ 930], 99.900th=[ 1074], 99.990th=[ 1287], 00:34:39.473 | 99.999th=[ 1434] 00:34:39.473 write: IOPS=22.2k, BW=86.8MiB/s (91.0MB/s)(847MiB/9766msec); 0 zone resets 00:34:39.473 slat (usec): min=24, max=1247, avg=79.36, stdev=28.60 00:34:39.473 clat (usec): min=21, max=1895, avg=418.56, stdev=220.80 00:34:39.473 lat (usec): min=61, max=2085, avg=497.92, stdev=230.30 00:34:39.473 clat percentiles (usec): 00:34:39.473 | 50.000th=[ 383], 99.000th=[ 996], 99.900th=[ 1123], 99.990th=[ 1401], 00:34:39.473 | 99.999th=[ 1844] 00:34:39.473 bw ( KiB/s): min=71720, max=122804, per=97.78%, avg=86869.26, stdev=2926.12, samples=76 00:34:39.474 iops : min=17930, max=30700, avg=21717.26, stdev=731.50, samples=76 00:34:39.474 lat (usec) : 20=0.01%, 50=0.01%, 100=1.86%, 250=29.05%, 500=39.61% 00:34:39.474 lat (usec) : 750=21.59%, 1000=7.24% 00:34:39.474 lat (msec) : 2=0.66% 00:34:39.474 cpu : usr=99.57%, sys=0.01%, ctx=111, majf=0, minf=277 00:34:39.474 IO depths : 1=6.0%, 2=26.9%, 4=53.7%, 8=13.4%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:39.474 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:39.474 complete : 0=0.0%, 4=88.2%, 8=11.8%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:39.474 issued rwts: total=202148,216914,0,0 short=0,0,0,0 dropped=0,0,0,0 00:34:39.474 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:39.474 00:34:39.474 Run status group 0 (all jobs): 00:34:39.474 READ: bw=79.0MiB/s (82.8MB/s), 79.0MiB/s-79.0MiB/s (82.8MB/s-82.8MB/s), io=790MiB (828MB), run=10001-10001msec 00:34:39.474 WRITE: bw=86.8MiB/s (91.0MB/s), 86.8MiB/s-86.8MiB/s (91.0MB/s-91.0MB/s), io=847MiB (888MB), run=9766-9766msec 00:34:39.474 00:34:39.474 real 0m13.566s 00:34:39.474 user 0m46.102s 00:34:39.474 sys 0m0.521s 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:34:39.474 ************************************ 00:34:39.474 END TEST bdev_fio_rw_verify 00:34:39.474 ************************************ 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "5bb97275-952c-5ea5-8d3f-c4c095fed50d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5bb97275-952c-5ea5-8d3f-c4c095fed50d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram1",' ' "aliases": [' ' "10fb1a13-b85c-5126-ab0c-ac7b3b025e7e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "10fb1a13-b85c-5126-ab0c-ac7b3b025e7e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4c6de1c6-651d-570c-8119-3192e45c9dcb"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4c6de1c6-651d-570c-8119-3192e45c9dcb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ea21ef88-4237-52bb-811c-a13194eda7ab"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ea21ef88-4237-52bb-811c-a13194eda7ab",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:34:39.474 crypto_ram1 00:34:39.474 crypto_ram2 00:34:39.474 crypto_ram3 ]] 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "5bb97275-952c-5ea5-8d3f-c4c095fed50d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5bb97275-952c-5ea5-8d3f-c4c095fed50d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "10fb1a13-b85c-5126-ab0c-ac7b3b025e7e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "10fb1a13-b85c-5126-ab0c-ac7b3b025e7e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "4c6de1c6-651d-570c-8119-3192e45c9dcb"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4c6de1c6-651d-570c-8119-3192e45c9dcb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' 
' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ea21ef88-4237-52bb-811c-a13194eda7ab"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ea21ef88-4237-52bb-811c-a13194eda7ab",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:39.474 
14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:39.474 ************************************ 00:34:39.474 START TEST bdev_fio_trim 00:34:39.474 ************************************ 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:39.474 14:00:26 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:39.474 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:39.474 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:39.474 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:39.474 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:39.474 fio-3.35 00:34:39.474 Starting 4 threads 00:34:51.696 00:34:51.696 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=638017: Fri Jul 12 14:00:39 2024 00:34:51.696 write: IOPS=34.6k, BW=135MiB/s (142MB/s)(1350MiB/10001msec); 0 zone resets 00:34:51.696 slat (usec): min=11, max=1157, avg=65.77, stdev=25.44 00:34:51.696 clat (usec): min=24, max=1477, avg=240.33, stdev=110.61 00:34:51.696 lat (usec): min=41, max=1539, avg=306.10, stdev=121.24 00:34:51.696 clat percentiles (usec): 00:34:51.696 | 50.000th=[ 227], 99.000th=[ 502], 99.900th=[ 668], 99.990th=[ 824], 00:34:51.696 | 99.999th=[ 1172] 00:34:51.696 bw ( KiB/s): min=121952, max=188245, per=99.78%, avg=137941.32, stdev=4345.64, samples=76 00:34:51.696 iops : min=30488, max=47061, avg=34485.32, stdev=1086.39, samples=76 00:34:51.696 trim: IOPS=34.6k, BW=135MiB/s (142MB/s)(1350MiB/10001msec); 0 zone resets 00:34:51.696 slat (usec): min=4, max=105, avg=19.85, stdev= 8.53 00:34:51.696 clat (usec): min=27, max=1540, avg=306.28, stdev=121.25 00:34:51.696 lat (usec): min=38, max=1570, avg=326.13, stdev=123.23 00:34:51.696 clat percentiles (usec): 00:34:51.696 | 50.000th=[ 297], 99.000th=[ 594], 99.900th=[ 799], 99.990th=[ 979], 00:34:51.696 | 99.999th=[ 1385] 00:34:51.696 bw ( KiB/s): min=121952, max=188245, per=99.78%, avg=137941.32, stdev=4345.64, samples=76 00:34:51.696 iops : min=30488, max=47061, avg=34485.32, stdev=1086.39, samples=76 00:34:51.696 lat (usec) : 50=0.24%, 100=4.77%, 250=41.56%, 500=49.93%, 750=3.41% 00:34:51.696 lat (usec) : 1000=0.09% 
00:34:51.696 lat (msec) : 2=0.01% 00:34:51.696 cpu : usr=99.61%, sys=0.00%, ctx=106, majf=0, minf=106 00:34:51.696 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:51.696 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:51.696 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:51.696 issued rwts: total=0,345647,345648,0 short=0,0,0,0 dropped=0,0,0,0 00:34:51.696 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:51.696 00:34:51.696 Run status group 0 (all jobs): 00:34:51.696 WRITE: bw=135MiB/s (142MB/s), 135MiB/s-135MiB/s (142MB/s-142MB/s), io=1350MiB (1416MB), run=10001-10001msec 00:34:51.696 TRIM: bw=135MiB/s (142MB/s), 135MiB/s-135MiB/s (142MB/s-142MB/s), io=1350MiB (1416MB), run=10001-10001msec 00:34:51.696 00:34:51.696 real 0m13.545s 00:34:51.696 user 0m45.996s 00:34:51.696 sys 0m0.487s 00:34:51.696 14:00:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:51.696 14:00:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:51.696 ************************************ 00:34:51.696 END TEST bdev_fio_trim 00:34:51.696 ************************************ 00:34:51.696 14:00:40 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:51.696 14:00:40 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:34:51.696 14:00:40 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:51.696 14:00:40 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:34:51.696 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:51.696 14:00:40 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:34:51.696 00:34:51.696 real 0m27.472s 00:34:51.696 user 1m32.296s 00:34:51.696 sys 0m1.194s 00:34:51.696 14:00:40 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:51.696 14:00:40 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:51.696 ************************************ 00:34:51.696 END TEST bdev_fio 00:34:51.696 ************************************ 00:34:51.696 14:00:40 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:51.696 14:00:40 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:51.696 14:00:40 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:51.696 14:00:40 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:51.696 14:00:40 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:51.696 14:00:40 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:51.696 ************************************ 00:34:51.696 START TEST bdev_verify 00:34:51.696 ************************************ 00:34:51.696 14:00:40 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:51.696 [2024-07-12 14:00:40.174812] Starting SPDK v24.09-pre git sha1 a49cd26ae / 
DPDK 24.03.0 initialization... 00:34:51.696 [2024-07-12 14:00:40.174878] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid639290 ] 00:34:51.954 [2024-07-12 14:00:40.301469] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:51.954 [2024-07-12 14:00:40.400085] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:51.954 [2024-07-12 14:00:40.400091] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:51.954 [2024-07-12 14:00:40.421439] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:51.954 [2024-07-12 14:00:40.429466] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:51.954 [2024-07-12 14:00:40.437489] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:52.212 [2024-07-12 14:00:40.550207] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:54.765 [2024-07-12 14:00:42.750919] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:54.765 [2024-07-12 14:00:42.751002] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:54.765 [2024-07-12 14:00:42.751017] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:54.765 [2024-07-12 14:00:42.758944] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:54.765 [2024-07-12 14:00:42.758963] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:54.765 [2024-07-12 14:00:42.758976] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:54.765 [2024-07-12 14:00:42.766968] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:54.765 [2024-07-12 14:00:42.766985] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:54.765 [2024-07-12 14:00:42.766996] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:54.765 [2024-07-12 14:00:42.774985] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:54.765 [2024-07-12 14:00:42.775002] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:54.765 [2024-07-12 14:00:42.775013] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:54.765 Running I/O for 5 seconds... 
00:35:00.048 00:35:00.048 Latency(us) 00:35:00.048 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:00.048 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:00.048 Verification LBA range: start 0x0 length 0x1000 00:35:00.048 crypto_ram : 5.08 478.76 1.87 0.00 0.00 266829.62 4302.58 163213.13 00:35:00.048 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:00.048 Verification LBA range: start 0x1000 length 0x1000 00:35:00.048 crypto_ram : 5.08 385.90 1.51 0.00 0.00 327472.00 1488.81 181449.24 00:35:00.048 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:00.048 Verification LBA range: start 0x0 length 0x1000 00:35:00.048 crypto_ram1 : 5.08 478.50 1.87 0.00 0.00 266033.35 4644.51 151359.67 00:35:00.048 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:00.048 Verification LBA range: start 0x1000 length 0x1000 00:35:00.048 crypto_ram1 : 5.08 381.47 1.49 0.00 0.00 333881.43 1909.09 204244.37 00:35:00.048 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:00.048 Verification LBA range: start 0x0 length 0x1000 00:35:00.048 crypto_ram2 : 5.06 3693.15 14.43 0.00 0.00 34353.51 7351.43 27696.08 00:35:00.048 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:00.048 Verification LBA range: start 0x1000 length 0x1000 00:35:00.048 crypto_ram2 : 5.06 2984.78 11.66 0.00 0.00 42602.00 6411.13 45362.31 00:35:00.048 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:00.048 Verification LBA range: start 0x0 length 0x1000 00:35:00.048 crypto_ram3 : 5.06 3690.10 14.41 0.00 0.00 34289.51 6325.65 27582.11 00:35:00.048 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:00.048 Verification LBA range: start 0x1000 length 0x1000 00:35:00.048 crypto_ram3 : 5.06 2982.11 11.65 0.00 0.00 42498.40 7864.32 35560.40 00:35:00.048 =================================================================================================================== 00:35:00.048 Total : 15074.77 58.89 0.00 0.00 67488.21 1488.81 204244.37 00:35:00.048 00:35:00.048 real 0m8.280s 00:35:00.048 user 0m15.688s 00:35:00.048 sys 0m0.377s 00:35:00.048 14:00:48 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:00.048 14:00:48 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:35:00.048 ************************************ 00:35:00.048 END TEST bdev_verify 00:35:00.048 ************************************ 00:35:00.048 14:00:48 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:00.048 14:00:48 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:00.048 14:00:48 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:35:00.048 14:00:48 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:00.048 14:00:48 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:00.048 ************************************ 00:35:00.048 START TEST bdev_verify_big_io 00:35:00.048 ************************************ 00:35:00.048 14:00:48 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:00.048 [2024-07-12 14:00:48.541062] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:00.048 [2024-07-12 14:00:48.541149] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid640354 ] 00:35:00.307 [2024-07-12 14:00:48.684984] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:00.307 [2024-07-12 14:00:48.795750] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:00.307 [2024-07-12 14:00:48.795755] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:00.307 [2024-07-12 14:00:48.817132] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:00.307 [2024-07-12 14:00:48.825159] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:00.307 [2024-07-12 14:00:48.833184] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:00.566 [2024-07-12 14:00:48.937050] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:03.097 [2024-07-12 14:00:51.153695] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:03.097 [2024-07-12 14:00:51.153781] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:03.097 [2024-07-12 14:00:51.153796] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:03.097 [2024-07-12 14:00:51.161715] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:03.097 [2024-07-12 14:00:51.161734] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:03.097 [2024-07-12 14:00:51.161746] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:03.097 [2024-07-12 14:00:51.169735] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:03.097 [2024-07-12 14:00:51.169752] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:03.097 [2024-07-12 14:00:51.169763] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:03.097 [2024-07-12 14:00:51.177757] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:03.097 [2024-07-12 14:00:51.177774] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:03.097 [2024-07-12 14:00:51.177790] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:03.097 Running I/O for 5 seconds... 00:35:03.664 [2024-07-12 14:00:52.165934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.664 [2024-07-12 14:00:52.166497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.928 [2024-07-12 14:00:52.274251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.274307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.274365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.274419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.275014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.275071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.275123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.275176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.275697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.275719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.278856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.278913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.278978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.279030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.279488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.279543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.279597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.279653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.928 [2024-07-12 14:00:52.280002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.280035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.283315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.283372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.283424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.283476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.929 [2024-07-12 14:00:52.283914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.283974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.284038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.284096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.284440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.284460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.287011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.287068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.287119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.287169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.287731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.287785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.287837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.287890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.288427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.288449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.291398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.291455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.291507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.291557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.292071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.292126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.292177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.292229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.929 [2024-07-12 14:00:52.292638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.292659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.295889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.295952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.296007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.296061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.296600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.296654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.296714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.296766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.297198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.297219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.299616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.299672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.299738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.299792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.300213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.300270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.300322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.300390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.300986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.301008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.304043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.304100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.929 [2024-07-12 14:00:52.304160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.304215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.304618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.304674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.304725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.304782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.305135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.305157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.308088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.308145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.308197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.308248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.308835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.308890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.308949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.309007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.309360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.309382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.311777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.311833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.311892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.311938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.312321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.312375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.929 [2024-07-12 14:00:52.312427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.312478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.312941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.312963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.318232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.320317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.321564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.323408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.325835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.326344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.326838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.327335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.327875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.327897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.332168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.334275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.335424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.335919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.336931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.338650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.340551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.342653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.929 [2024-07-12 14:00:52.343036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.343060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.930 [2024-07-12 14:00:52.346002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.346509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.347625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.349457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.351906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.353135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.355045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.357121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.357471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.357492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.362586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.364717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.366141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.368003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.370441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.371846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.372356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.372850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.373399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.373425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.377571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.379659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.381664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.382168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.930 [2024-07-12 14:00:52.383210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.384043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.385879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.387966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.388320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.388341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.391283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.391780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.392278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.394164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.396602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.398031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.399902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.401999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.402345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.402366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.407486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.409506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.411585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.412977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.415482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.417572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.418080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.418574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.930 [2024-07-12 14:00:52.419191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.419214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.423589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.425619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.427714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.428835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.429932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.430427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.432414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.434442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.434788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.434814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.437536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.438040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.438533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.440015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.442498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.444589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.446076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.447894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.448245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.448267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.453255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.455251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.930 [2024-07-12 14:00:52.457333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.458583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.460920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.463017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.464273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.464768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.465341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.465363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.469424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.471294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.473379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.475469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.476517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.477018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.477511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.479344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.479690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.479717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.483933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.484435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.484946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.485441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.487685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.489761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.930 [2024-07-12 14:00:52.491764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.493554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.930 [2024-07-12 14:00:52.493983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.931 [2024-07-12 14:00:52.494005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.931 [2024-07-12 14:00:52.497546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.931 [2024-07-12 14:00:52.499390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.931 [2024-07-12 14:00:52.501394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.931 [2024-07-12 14:00:52.502666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.931 [2024-07-12 14:00:52.504826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.931 [2024-07-12 14:00:52.506833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.507336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.507829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.508445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.508469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.512075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.512583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.513094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.513590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.514671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.515180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.515675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.516174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.516712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.516734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.192 [2024-07-12 14:00:52.520325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.520831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.192 [2024-07-12 14:00:52.521335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.521828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.522862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.523370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.523866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.524364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.524914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.524944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.528572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.529082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.529582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.530080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.531130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.531630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.532132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.532631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.533154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.533176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.536828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.537339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.537835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.538334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.193 [2024-07-12 14:00:52.539386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.539888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.540391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.540887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.541419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.541440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.545092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.545594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.546096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.546589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.547603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.548106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.548599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.549097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.549567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.549588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.553192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.553706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.554216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.554727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.555768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.556271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.556763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.557260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.193 [2024-07-12 14:00:52.557735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.557756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.561436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.561943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.562440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.562938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.563933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.564435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.564933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.565428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.565989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.566010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.569792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.570304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.570797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.571296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.572267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.572766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.573263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.573758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.574266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.574288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.578039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.193 [2024-07-12 14:00:52.578538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.193 [2024-07-12 14:00:52.579035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:04.193-00:35:04.461 [... the same accel_dpdk_cryptodev.c:468 accel_dpdk_cryptodev_task_alloc_resources *ERROR*: "Failed to get src_mbufs!" message repeats several hundred more times between 14:00:52.579 and 14:00:52.883; identical repetitions omitted ...]
00:35:04.461 [2024-07-12 14:00:52.883654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:04.461 [2024-07-12 14:00:52.884143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.884170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.887593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.888121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.888623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.889120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.889671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.890191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.890691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.891190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.891684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.892177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.892200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.895637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.896171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.896669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.897169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.897711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.898230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.898730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.899231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.899728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.900187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.900210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.461 [2024-07-12 14:00:52.903628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.904154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.904651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.905152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.905684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.906206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.906707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.907207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.907700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.908191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.908213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.911630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.912150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.912646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.913147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.913680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.914199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.914702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.915201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.915698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.916163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.916186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.919565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.920077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.461 [2024-07-12 14:00:52.920573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.921070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.921621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.922140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.922642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.923139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.923633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.924142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.924165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.927577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.928089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.928589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.929088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.929648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.930167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.930666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.931165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.931658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.932155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.932178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.935581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.936091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.936587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.937085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.461 [2024-07-12 14:00:52.937639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.938158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.938661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.939162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.939655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.940155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.940177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.944091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.945230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.947235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.947732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.948276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.948781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.949286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.949783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.950295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.950782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.950803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.954875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.956885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.958907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.961004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.961419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.961942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.461 [2024-07-12 14:00:52.962444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.962942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.964891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.965240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.965262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.969405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.970193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.970690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.971191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.971735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.973682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.975763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.977847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.979059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.979437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.979459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.982404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.983209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.985039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.987103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.987450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.988698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.990522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.992613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.461 [2024-07-12 14:00:52.994709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.995199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.995221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:52.999997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.002095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.003325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.005153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.005502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.007604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.008251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.008757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.009278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.009767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.009789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.013686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.015779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.017783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.018288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.018836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.019353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.020428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.022267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.024350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.024693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.461 [2024-07-12 14:00:53.024714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.461 [2024-07-12 14:00:53.027291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.462 [2024-07-12 14:00:53.027789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.462 [2024-07-12 14:00:53.028286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.462 [2024-07-12 14:00:53.030207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.462 [2024-07-12 14:00:53.030567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.462 [2024-07-12 14:00:53.032664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.462 [2024-07-12 14:00:53.034473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.462 [2024-07-12 14:00:53.036471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.038471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.038815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.038836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.043014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.044859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.046860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.048587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.048944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.050777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.052784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.054441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.054955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.055459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.055480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.059559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.722 [2024-07-12 14:00:53.061570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.063618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.065714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.066146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.066661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.067165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.067662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.069667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.070024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.070047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.074160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.075279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.075773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.076272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.076821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.078836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.080904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.083002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.084232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.084639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.084666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.087606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.088117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.089946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.722 [2024-07-12 14:00:53.092034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.092380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.093610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.095453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.722 [2024-07-12 14:00:53.097535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.099636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.100159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.100181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.105009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.107093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.108303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.110127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.110471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.112581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.113247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.113743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.114243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.114789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.114810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.118670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.120775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.122858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.123359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.123899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.723 [2024-07-12 14:00:53.124413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.125501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.127243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.129331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.129678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.129698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.132249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.132753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.133254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.134805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.135185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.137304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.139307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.140972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.142807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.143161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.143183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.147329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.149173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.151216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.152995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.153354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.155200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.157210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.723 [2024-07-12 14:00:53.158843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.159342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.159868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.159889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.164300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.165986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.167859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.169981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.170337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.170855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.171352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.171845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.173830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.174188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.174210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.178309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.179523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.180029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.180524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.181091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.182995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.184989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.187079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.188378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.723 [2024-07-12 14:00:53.188723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.188745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.191637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.192140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.193971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.196053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.196400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.197650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.199483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.201572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.203664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.204167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.204189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.208933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.211034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.212274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.214124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.214467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.216583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.217487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.217985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.218476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.219027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.219048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.723 [2024-07-12 14:00:53.222908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.224992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.225051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.227036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.227559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.228073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.228570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.229582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.723 [2024-07-12 14:00:53.231420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.231763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.231783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.235864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.236370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.236864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.236921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.237481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.238887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.240712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.242787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.244583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.244947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.244967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.247291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.724 [2024-07-12 14:00:53.247348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.724 [2024-07-12 14:00:53.247401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the identical "accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!" message repeats continuously from 14:00:53.247401 through 14:00:53.585065; duplicate entries omitted ...]
00:35:05.247 [2024-07-12 14:00:53.585065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:05.247 [2024-07-12 14:00:53.587679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.588191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.588689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.590666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.591021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.593140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.594618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.596550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.598640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.598993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.599014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.603534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.247 [2024-07-12 14:00:53.605461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.607570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.608998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.609343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.611351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.613433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.614779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.615287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.615764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.615786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.619538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.621382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.248 [2024-07-12 14:00:53.623447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.625510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.625994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.626526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.627032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.627527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.629381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.629731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.629751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.633851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.634555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.635057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.635551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.636101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.637953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.640030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.642126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.643352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.643785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.643806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.646796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.647890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.649725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.651809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.248 [2024-07-12 14:00:53.652167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.653427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.655285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.657373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.659276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.659831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.659852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.664582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.666669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.668085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.669905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.670257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.672374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.672876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.673375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.673880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.674329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.674351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.678327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.680426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.681920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.682440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.682956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.683465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.248 [2024-07-12 14:00:53.685131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.686987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.689218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.689631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.689653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.692207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.692710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.693210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.695054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.695401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.697510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.698810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.700638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.702723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.703077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.703098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.708099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.710179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.712248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.713501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.713914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.716025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.718110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.718959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.248 [2024-07-12 14:00:53.719455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.719993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.720015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.723454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.725302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.727384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.729460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.730006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.730511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.731013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.731676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.733537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.733885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.733905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.738059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.248 [2024-07-12 14:00:53.738576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.739081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.739574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.740076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.742176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.744260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.745130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.747211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.747559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.249 [2024-07-12 14:00:53.747580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.752302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.754306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.756365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.757676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.758027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.759851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.761916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.763498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.764002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.764542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.764563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.768151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.768654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.769160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.769659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.770229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.770737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.771238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.771740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.772243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.772778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.772799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.776159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.249 [2024-07-12 14:00:53.776665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.777171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.777671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.778192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.778699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.779208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.779698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.780210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.780812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.780836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.784219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.784722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.785227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.785737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.786264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.786774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.787273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.787771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.788275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.788767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.788789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.792129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.792626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.793149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.249 [2024-07-12 14:00:53.793647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.794078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.794587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.795091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.795586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.796085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.796582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.796604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.800231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.800735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.800797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.801295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.801827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.802342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.802842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.803342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.803838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.804400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.804421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.807681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.808193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.809530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.809590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.810091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.249 [2024-07-12 14:00:53.810603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.812451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.812952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.813454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.813950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.813981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.816599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.816657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.816710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.816763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.817203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.817277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.817336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.817389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.817453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.817980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.818001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.820573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.820630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.820682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.820737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.821202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.821276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.821336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.249 [2024-07-12 14:00:53.821389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.821441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.821938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.821959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.824588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.249 [2024-07-12 14:00:53.824646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.250 [2024-07-12 14:00:53.824698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.250 [2024-07-12 14:00:53.824751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.250 [2024-07-12 14:00:53.825226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.250 [2024-07-12 14:00:53.825301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.250 [2024-07-12 14:00:53.825355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.250 [2024-07-12 14:00:53.825407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.250 [2024-07-12 14:00:53.825464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.250 [2024-07-12 14:00:53.826041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.250 [2024-07-12 14:00:53.826063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.511 [2024-07-12 14:00:53.828607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.511 [2024-07-12 14:00:53.828665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.511 [2024-07-12 14:00:53.828720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.511 [2024-07-12 14:00:53.828773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.511 [2024-07-12 14:00:53.829260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.511 [2024-07-12 14:00:53.829355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.511 [2024-07-12 14:00:53.829434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.511 [2024-07-12 14:00:53.829499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.511 [2024-07-12 14:00:53.829552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.511 [2024-07-12 14:00:53.830042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.830064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.832610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.832668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.832722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.832774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.833259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.833342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.833419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.833481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.833534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.834006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.834027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.836499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.836556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.836609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.836663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.837189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.837270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.837329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.837395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.837461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.837918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.837946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.512 [2024-07-12 14:00:53.840444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.840514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.840567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.840620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.841150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.841227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.841297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.841370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.841432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.841889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.841911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.844332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.844388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.844446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.844498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.845015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.845080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.845158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.845211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.845277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.845873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.845894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.848405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.848462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.512 [2024-07-12 14:00:53.848514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.848570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.849099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.849170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.849223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.849287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.849342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.849826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.849848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.852381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.852438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.852490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.852542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.853077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.853142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.853195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.853249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.853331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.853826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.853847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.856456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.856525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.856578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.512 [2024-07-12 14:00:53.856630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.512 [2024-07-12 14:00:53.857161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:05.512-00:35:05.778 [2024-07-12 14:00:53.857226 .. 2024-07-12 14:00:54.253221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (same message repeated for every log entry between these timestamps)
00:35:05.778 [2024-07-12 14:00:54.253722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.254220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.254763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.255288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.255802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.256303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.256806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.257361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.257384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.260785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.261308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.261798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.262294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.262832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.263345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.263844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.264343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.264836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.265368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.265391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.268663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.269174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.269676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.270179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.778 [2024-07-12 14:00:54.270729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.271247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.271748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.272250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.272743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.273266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.273291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.276244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.276748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.277252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.277753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.278315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.278821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.279319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.279816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.280343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.280889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.280911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.285564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.286808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.288643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.290736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.291091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.291632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.778 [2024-07-12 14:00:54.292131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.292639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.294022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.294426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.294447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.298467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.300052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.300552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.301050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.301594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.303283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.305163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.307266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.308770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.309123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.309145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.312056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.312555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.312614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.314456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.314800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.316924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.318155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.319985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.778 [2024-07-12 14:00:54.322059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.322403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.778 [2024-07-12 14:00:54.322425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.327382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.329468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.331563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.331621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.332146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.334001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.336100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.338177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.338685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.339245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.339268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.341980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.342045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.342102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.342154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.342547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.342621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.342677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.342733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.342784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.343211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.779 [2024-07-12 14:00:54.343233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.345811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.345870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.345954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.346008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.346606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.346671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.346726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.346778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.346830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.347208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.347229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.349379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.349436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.349488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.349539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.349877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.349954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.350007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.350064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.350141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.350483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.350504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.353550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.779 [2024-07-12 14:00:54.353607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.353658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.353717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.354062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.354132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.354193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.354246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.354298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.354639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.779 [2024-07-12 14:00:54.354660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.356864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.356933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.356987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.357039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.357596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.357664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.357718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.357772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.357825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.358317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.358339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.360691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.360761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.360813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.041 [2024-07-12 14:00:54.360865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.361347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.361418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.361470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.361521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.361573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.361969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.361991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.364734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.364792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.364845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.364898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.365438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.365503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.365560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.365611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.041 [2024-07-12 14:00:54.365662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.366035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.366056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.368241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.368298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.368363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.368418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.368757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.042 [2024-07-12 14:00:54.368828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.368880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.368939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.368991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.369497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.369519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.372354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.372411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.372467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.372519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.372856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.372938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.372991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.373059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.373111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.373468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.373489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.375630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.375687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.375739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.375791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.376326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.376390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.376444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.042 [2024-07-12 14:00:54.376497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.376550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.377138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.377161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.379356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.379414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.379466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.379522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.379864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.379943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.380000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.380050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.380101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.380441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.380462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.383254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.383313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.383373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.383426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.383816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.383882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.383942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.384001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.384055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.042 [2024-07-12 14:00:54.384398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.384418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.386577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.386634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.386686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.386744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.387088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.387165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.387221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.387273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.387325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.387878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.387899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.390514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.390575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.390627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.390679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.391021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.391091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.391144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.391195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.391260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.391724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.391745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.042 [2024-07-12 14:00:54.394138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.394197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.394250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.394301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.394828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.394891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.394950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.395004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.395056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.395591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.395613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.397777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.397834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.397890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.397949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.398304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.398373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.398434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.398489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.042 [2024-07-12 14:00:54.398541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.398883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.398904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.403134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.403197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.043 [2024-07-12 14:00:54.403248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.403300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.403638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.403713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.403780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.403832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.403885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.404233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.404255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.409424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.409491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.409543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.409594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.409978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.410051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.410103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.410155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.410206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.410543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.410563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.417021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.417085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.417137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.417189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.043 [2024-07-12 14:00:54.417701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.417764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.417818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.417871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.417924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.418458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.418479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.423556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.423638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.423695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.423755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.424163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.424237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.424308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.424361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.424413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.424994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.425016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.429991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.430054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.430105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.430156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.430553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.430629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.043 [2024-07-12 14:00:54.430681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.430746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.430802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.431147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.431168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.435729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.435792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.435843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.435894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.436240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.436311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.436362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.436421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.436474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.436879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.436900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.441830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.441899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.441963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.442019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.442395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.442468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.442520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.442571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.043 [2024-07-12 14:00:54.442623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.442967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.442989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.449494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.449558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.449619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.449671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.450148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.450212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.450264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.450316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.450368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.450906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.450935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.456186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.456253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.456314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.456366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.456735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.456809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.456864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.456941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.456994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.043 [2024-07-12 14:00:54.457583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.044 [2024-07-12 14:00:54.457609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.462627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.462691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.462743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.462793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.463177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.463248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.463301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.463352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.463424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.463764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.463785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.468582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.468658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.468710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.468761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.469107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.469180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.469232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.469288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.469342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.469742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.469763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.474700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.044 [2024-07-12 14:00:54.474763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.474841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.474896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.475281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.475356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.475412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.475468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.475519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.475859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.475879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.482587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.482651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.482705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.482757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.483288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.483356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.483408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.483461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.483515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.484062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.484084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.489184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.489248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.489310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.044 [2024-07-12 14:00:54.489361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.489740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.489810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.489863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.489943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.489998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.490595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.490616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.495664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.495726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.497566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.497623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.497975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.498048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.498107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.498162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.498213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.498626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.498647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.501662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.501726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.501783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.503858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.504205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.044 [2024-07-12 14:00:54.504277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.504329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.504381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.504432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.504964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.504985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.507746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.508250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.509887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.511728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.512075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.514085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.515796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.517678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.519795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.520168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.520189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.525288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.527293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.528723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.530801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.531147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.531657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.532159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.044 [2024-07-12 14:00:54.532651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.534170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.534548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.044 [2024-07-12 14:00:54.534568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.538672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.540013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.540512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.541011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.541555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.543549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.545578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.547666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.548851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.549201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.549224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.552144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.552644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.553148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.553644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.554183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.554687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.555201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.555707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.556210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.045 [2024-07-12 14:00:54.556751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.556774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.560357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.560860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.561366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.561862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.562362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.562864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.563365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.563865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.564365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.564921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.564949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.568428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.568935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.569435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.569936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.570452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.570962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.571459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.571963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.572456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.573011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.573033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.045 [2024-07-12 14:00:54.576471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.576979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.577479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.577979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.578524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.579034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.579536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.580051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.580549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.581164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.581187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.584742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.585258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.585761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.586265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.586833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.587349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.587850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.588350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.588848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.589424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.589447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.592874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.593387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.045 [2024-07-12 14:00:54.593889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.594388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.595010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.595516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.596022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.596513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.597010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.597548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.597570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.601007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.601513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.602015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.602504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.603078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.045 [2024-07-12 14:00:54.603586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.604092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.604604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.605101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.605635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.605656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.609145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.609650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.610147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.610637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.046 [2024-07-12 14:00:54.611180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.611685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.612189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.612682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.613178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.613716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.046 [2024-07-12 14:00:54.613737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:06.304 [2024-07-12 14:00:54.676368] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.678056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.681962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.683860] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.683933] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.684303] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.684698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.685148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.685218] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.685591] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.685655] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.686030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.686090] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.687578] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.688913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.689200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.689223] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.304 [2024-07-12 14:00:54.689244] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.696508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.698442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.698846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.699244] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.699697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.700108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.700509] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.701422] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.702824] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.703108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.703130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.703151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.706268] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.708195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.710030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.711700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.712084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.712498] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.712898] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.713300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.713699] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.714147] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.714174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.304 [2024-07-12 14:00:54.714200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.721478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.722939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.724602] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.726261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.726537] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.726957] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.727353] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.727746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.728158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.728605] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.728634] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.728662] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.731955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.733293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.734678] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.736005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.736280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.738064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.739735] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.740500] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.740913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.741369] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.741395] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.304 [2024-07-12 14:00:54.741422] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.747712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.749646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.750301] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.751957] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.752236] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.753911] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.304 [2024-07-12 14:00:54.755610] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.757044] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.757440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.757912] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.757947] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.757971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.761584] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.763511] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.765441] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.767156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.767571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.769235] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.770554] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.772232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.773908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.774192] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.774215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.305 [2024-07-12 14:00:54.774236] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.779253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.781204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.782847] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.784539] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.784893] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.786229] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.787542] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.789225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.791038] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.791318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.791340] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.791360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.794084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.794483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.796411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.798016] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.798297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.799988] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.801707] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.802726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.804049] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.804326] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.804348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.305 [2024-07-12 14:00:54.804369] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.808794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.809208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.810697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.812002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.812280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.814156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.815824] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.816517] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.818435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.818783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.818805] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.818826] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.820870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.821277] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.821673] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.822076] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.822488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.824421] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.825983] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.827649] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.829318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.829597] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.829621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.305 [2024-07-12 14:00:54.829646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.834538] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.834948] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.835345] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.835743] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.836190] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.837669] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.838960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.840621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.842348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.842627] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.842649] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.842670] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.846232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.848154] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.848571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.848974] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.849424] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.849831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.850232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.851084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.852549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.852827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.305 [2024-07-12 14:00:54.852849] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.305 [2024-07-12 14:00:54.852869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:35:06.305 [... the same accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! message repeats continuously, one entry per log line, from 14:00:54.852869 through 14:00:55.080611 ...]
00:35:06.572 [2024-07-12 14:00:55.080611] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:35:06.572 [2024-07-12 14:00:55.080633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.572 [2024-07-12 14:00:55.080696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.572 [2024-07-12 14:00:55.080743] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.572 [2024-07-12 14:00:55.080790] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.572 [2024-07-12 14:00:55.080836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.572 [2024-07-12 14:00:55.081112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.572 [2024-07-12 14:00:55.081135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.572 [2024-07-12 14:00:55.081156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.572 [2024-07-12 14:00:55.081176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.572 [2024-07-12 14:00:55.085525] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.572 [2024-07-12 14:00:55.085581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.572 [2024-07-12 14:00:55.085627] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.085673] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.086142] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.086166] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.086245] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.086305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.086354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.086401] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.086855] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.086885] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.086906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.086942] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.089244] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.573 [2024-07-12 14:00:55.089296] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.089347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.089394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.089734] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.089757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.089821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.089869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.090181] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.090206] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.095450] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.095505] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.095553] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.095599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.095872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.095895] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.095968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.096017] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.096062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.096109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.096534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.096570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.099077] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.099131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.099185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.573 [2024-07-12 14:00:55.099235] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.099630] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.099653] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.099712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.099763] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.099822] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.101102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.101154] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.101429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.101451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.101472] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.106821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.106876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.106922] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.106974] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.107248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.107271] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.107334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.107381] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.107427] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.107473] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.107751] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.107775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.107798] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.573 [2024-07-12 14:00:55.110590] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.110650] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.110698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.110750] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.111031] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.111055] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.111111] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.111157] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.111211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.111258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.111539] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.111568] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.111588] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.115940] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.117591] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.117671] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.119315] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.119666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.119690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.573 [2024-07-12 14:00:55.119756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.119816] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.120196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.120276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.120649] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.574 [2024-07-12 14:00:55.121063] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.121087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.121110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.121131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.123139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.124766] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.124821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.126479] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.126787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.126811] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.126918] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.128381] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.128460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.129760] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.130043] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.130066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.130087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.130107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.134318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.134726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.134776] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.135939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.136233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.136257] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.574 [2024-07-12 14:00:55.136344] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.138257] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.138321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.140249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.140729] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.140752] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.140774] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.574 [2024-07-12 14:00:55.140797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.146703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.147119] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.147178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.147567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.148030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.148056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.148119] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.148924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.148981] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.150601] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.150882] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.150905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.150930] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.150950] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.156379] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.158059] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.836 [2024-07-12 14:00:55.158116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.159562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.159946] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.159971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.160029] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.160428] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.160480] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.160874] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.161383] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.161412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.161434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.161459] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.165070] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.165727] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.165780] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.167635] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.167921] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.167948] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.168011] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.169684] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.169735] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.171403] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.171691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.171715] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.836 [2024-07-12 14:00:55.171737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.171758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.174822] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.176145] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.176198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.177858] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.178141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.178168] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.178234] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.836 [2024-07-12 14:00:55.180161] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.180219] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.181096] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.181378] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.181400] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.181433] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.181454] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.186676] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.187097] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.187501] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.187897] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.188184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.188207] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.188267] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.189567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.837 [2024-07-12 14:00:55.191228] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.192892] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.193188] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.193211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.193232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.193252] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.197573] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.197984] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.198384] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.198783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.199249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.199274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.200110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.201580] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.203504] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.205360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.205641] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.205664] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.205684] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.205704] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.210695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.211109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.211506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.211901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.837 [2024-07-12 14:00:55.212367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.212397] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.212794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.214716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.216130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.217802] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.218087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.218111] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.218131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.218151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.224565] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.224974] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.225371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.225765] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.226277] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.226307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.226711] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.228273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.229582] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.231248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.231536] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.231559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.231579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.231599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.837 [2024-07-12 14:00:55.238554] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.238980] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.239382] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.239776] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.240290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.240315] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.240720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.241835] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.243148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.244951] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.245233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.245256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.245276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.245297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.251607] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.252220] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.252623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.253027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.253446] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.253471] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.253876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.254438] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.256184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.257939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.837 [2024-07-12 14:00:55.258220] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.258244] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.258268] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.258288] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.264620] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.265753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.266158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.266556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.266979] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.267007] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.267412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.837 [2024-07-12 14:00:55.267810] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.269660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.270983] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.271265] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.271288] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.271309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.271330] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.277941] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.279464] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.279863] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.280263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.280721] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.280745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.838 [2024-07-12 14:00:55.281180] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.281576] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.282998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.284319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.284601] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.284623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.284644] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.284664] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.291897] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.293746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.294151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.294548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.294992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.295018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.295422] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.295825] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.296987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.298299] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.298580] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.298603] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.298623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.298643] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.305883] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:06.838 [2024-07-12 14:00:55.307827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:06.838 [2024-07-12 14:00:55.308249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
[... identical "Failed to get dst_mbufs!" errors from accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources repeat continuously between 14:00:55.308249 and 14:00:55.580551; duplicate log lines elided ...]
00:35:07.107 [2024-07-12 14:00:55.580551] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:35:07.107 [2024-07-12 14:00:55.580600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.580653] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.580939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.580964] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.581028] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.581075] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.581122] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.581168] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.581617] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.581645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.581670] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.581693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.584635] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.584698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.584747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.584794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.585080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.585105] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.585170] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.585217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.585263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.585309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.585583] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.585606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.107 [2024-07-12 14:00:55.585627] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.585647] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.590886] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.107 [2024-07-12 14:00:55.590952] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.591000] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.591048] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.591455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.591487] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.591551] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.591603] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.591650] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.591696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.591982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.592006] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.592026] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.592051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.595715] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.595771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.595822] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.595869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.596181] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.596206] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.596274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.596322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.108 [2024-07-12 14:00:55.596371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.596417] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.596694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.596717] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.596737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.596757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.601901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.601968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.602016] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.602062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.602339] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.602361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.602425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.602471] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.602522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.602569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.603044] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.603068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.603089] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.603108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.606027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.606089] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.606136] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.606182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.108 [2024-07-12 14:00:55.606458] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.606480] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.606544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.606594] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.606641] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.606687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.606968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.606992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.607013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.607033] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.612394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.612451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.612498] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.612544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.612989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.613014] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.613076] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.613143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.613195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.613242] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.613526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.613549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.613569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.613589] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.108 [2024-07-12 14:00:55.617502] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.617559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.617609] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.617655] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.617972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.617996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.618060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.618108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.618157] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.618203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.618481] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.618506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.618526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.618546] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.623644] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.623700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.623747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.623794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.624079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.624102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.624165] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.624212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.624258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.624304] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.108 [2024-07-12 14:00:55.624787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.624810] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.108 [2024-07-12 14:00:55.624831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.624855] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.627803] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.627864] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.627912] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.627965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.628243] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.628266] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.628330] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.630244] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.630300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.630346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.630619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.630642] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.630662] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.630682] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.634904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.634971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.635022] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.635069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.635346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.635368] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.109 [2024-07-12 14:00:55.635428] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.635475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.635521] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.635567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.636071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.636095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.636120] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.636141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.640051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.640119] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.640166] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.640212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.640493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.640515] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.640579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.640626] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.640672] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.640718] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.641001] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.641025] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.641046] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.641066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.645137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.645629] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.109 [2024-07-12 14:00:55.645694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.646093] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.646557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.646586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.646652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.647057] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.647112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.647510] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.647912] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.647959] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.647980] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.648000] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.652593] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.653014] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.653069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.653463] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.653749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.653772] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.653830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.654280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.654334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.654725] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.655058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.655083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.109 [2024-07-12 14:00:55.655103] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.655123] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.660408] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.662238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.662294] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.664114] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.664394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.664417] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.664480] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.665443] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.665495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.665908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.666359] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.666388] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.666411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.666436] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.670469] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.672158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.672211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.673356] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.673656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.673678] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.673739] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.675056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.109 [2024-07-12 14:00:55.675110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.676785] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.677074] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.677098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.677119] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.109 [2024-07-12 14:00:55.677139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.681430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.682591] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.682645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.683050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.683485] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.683508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.683573] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.685269] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.685331] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.687254] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.687533] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.687556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.687577] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.687598] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.692317] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.694185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.694240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.694644] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.370 [2024-07-12 14:00:55.695155] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.695185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.695248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.695649] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.695713] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.696123] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.696571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.696595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.696620] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.696645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.701835] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.703780] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.703831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.705418] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.705697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.705719] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.705784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.707441] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.707495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.708739] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.709084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.709107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.709128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.709147] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.370 [2024-07-12 14:00:55.714846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.716413] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.717731] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.719415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.719694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.719717] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.719783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.721714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.370 [2024-07-12 14:00:55.722402] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.724037] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.724319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.724341] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.724367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.724387] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.728852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.729260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.730878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.732198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.732477] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.732499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.734440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.736343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.736817] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.738707] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.371 [2024-07-12 14:00:55.739002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.739025] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.739045] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.739066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.743668] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.745022] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.746002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.746401] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.746849] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.746872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.748808] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.750351] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.752033] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.753695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.753982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.754018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.754039] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.754060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.759140] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.759551] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.759954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.760352] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.760808] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.760838] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.371 [2024-07-12 14:00:55.762632] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.763943] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.765605] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.767277] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.767559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.767582] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.767602] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.767623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.772681] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.774111] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.774512] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.774906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.775193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.775217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.776133] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.776540] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.777331] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.778850] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.779139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.779162] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.779183] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.779203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.785436] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.787121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.371 [2024-07-12 14:00:55.787961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.788381] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.788841] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.788870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.789282] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.789680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.790137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.792021] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.792307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.792330] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.792351] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.792371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.798164] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.799845] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.801058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.802456] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.802760] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.802785] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.803214] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.803611] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.805534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.805947] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.806406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.806431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.371 [2024-07-12 14:00:55.806456] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.806482] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.812701] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.814049] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.815723] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.817409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.817689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.817716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.818211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.818609] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.819016] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.819411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.371 [2024-07-12 14:00:55.819879] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.819908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.819942] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.819967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.825499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.826833] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.828518] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.830424] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.830704] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.830726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.831547] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.833386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.372 [2024-07-12 14:00:55.833905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.834311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.834680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.834703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.834724] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.834744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.842014] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.843965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.844769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.846285] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.846565] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.846588] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.848281] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.849961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.851146] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.851544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.852013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.852042] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.852064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.852090] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.858177] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.860103] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.860908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.862434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.372 [2024-07-12 14:00:55.862714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.862737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.864435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.866119] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.867266] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.868698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.869012] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.869036] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.869058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.869079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.873432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.875367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.877290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.878965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.879382] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.879418] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.881077] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.882399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.884089] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.885765] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.886058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.886081] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.886102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.886122] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.372 [2024-07-12 14:00:55.889045] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.890538] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.891837] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.893512] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.893789] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.893812] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.895556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.896124] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.898050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.899650] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.899936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.899958] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.899978] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.899998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.902046] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.902450] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.903830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.904778] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.905264] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.905291] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.905692] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.907615] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.909182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.910851] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.372 [2024-07-12 14:00:55.911135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.911158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.911178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.911202] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.914573] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.916260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.917237] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.917634] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.918087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.918111] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.918514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.918916] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.919315] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.921225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.372 [2024-07-12 14:00:55.921599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.921622] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.921642] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.921663] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.925144] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.926826] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.928502] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.929386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.929664] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.929687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.373 [2024-07-12 14:00:55.930822] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.931229] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.931624] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.933546] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.934056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.934079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.934102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.934128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.937896] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.938346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.940215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.941709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.941994] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.942017] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.943713] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.945441] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.945839] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.946244] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.946696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.946723] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.946746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.373 [2024-07-12 14:00:55.946766] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.950051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.951637] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.640 [2024-07-12 14:00:55.952041] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.952436] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.952716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.952738] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.953829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.954238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.954637] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.955056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.955427] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.955451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.955473] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.955508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.958389] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.958794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.959206] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.960734] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.961085] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.961110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.961523] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.961918] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.963840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.964247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.964719] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.964749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.640 [2024-07-12 14:00:55.964775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.964801] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.967473] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.967876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.968276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.968677] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.969084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.969108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.969529] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.971453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.971856] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.972259] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.972609] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.972631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.972652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.972672] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.975258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.975669] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.976076] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.976483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.976995] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.977024] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.977425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.977830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.640 [2024-07-12 14:00:55.978246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.979033] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.979311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.979334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.979354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.979374] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.982097] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.982500] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.982912] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.983317] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.983780] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.983804] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.984221] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.984613] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.985016] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.985426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.985771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.985795] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.985816] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.985842] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.988310] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.990190] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.990718] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.991120] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.640 [2024-07-12 14:00:55.991541] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.991564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.992003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.992060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.992471] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.992870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.993346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.993371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.993396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.993420] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.996149] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.998071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.998475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.998871] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.999246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:55.999270] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:56.000937] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:56.001332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:56.001729] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:56.002726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:56.003010] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:56.003036] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.640 [2024-07-12 14:00:56.003058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.003079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.641 [2024-07-12 14:00:56.005270] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.005673] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.006087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.006491] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.006840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.006863] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.008312] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.010237] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.012136] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.012564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.012843] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.012866] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.012891] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.012911] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.016387] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.016444] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.016840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.016892] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.017340] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.017365] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.019009] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.019061] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.019546] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.019602] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.641 [2024-07-12 14:00:56.020067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.020096] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.020118] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.020144] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.022929] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.022986] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.023393] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.023445] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.023902] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.023936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.024350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.024407] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.024805] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.024861] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.025292] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.025316] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.025336] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.025356] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.028233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.028290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.028689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.028742] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.029137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.029161] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.641 [2024-07-12 14:00:56.029566] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.029635] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.030057] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.030111] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.030600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.030628] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.030651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.030674] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.033430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.033502] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.034288] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.034340] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.034617] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.034640] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.035056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.035109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.035506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.035571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.035848] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.035871] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.035892] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.035911] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.038606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.038683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.641 [2024-07-12 14:00:56.039085] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.039141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.039604] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.039629] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.040039] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.040090] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.040488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.040543] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.040937] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.040960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.040981] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.041001] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.043455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.043512] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.044637] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.044689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.044990] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.045014] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.045425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.045481] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.045878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.045939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.046324] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.046349] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.641 [2024-07-12 14:00:56.046371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.641 [2024-07-12 14:00:56.046391] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.049168] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.049226] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.049622] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.049676] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.050075] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.050100] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.050514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.050571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.052152] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.052204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.052621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.052645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.052667] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.052689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.055269] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.055328] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.055376] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.055437] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.055789] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.055814] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.056232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.642 [2024-07-12 14:00:56.056290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.642 [2024-07-12 14:00:56.056338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:35:07.642 [... identical "Failed to get dst_mbufs!" errors from accel_dpdk_cryptodev.c:476 repeated continuously from 14:00:56.056 through 14:00:56.250; duplicate entries elided ...]
00:35:07.985 [2024-07-12 14:00:56.250934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:35:07.985 [2024-07-12 14:00:56.251752] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.252158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.252576] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.252599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.252619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.252639] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.255636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.257406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.257847] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.259749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.260263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.260286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.260691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.261690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.263018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.263417] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.263861] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.263888] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.263911] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.263940] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.266761] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.267897] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.269107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.269506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.985 [2024-07-12 14:00:56.269960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.269989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.271651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.272324] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.272724] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.273713] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.274003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.274027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.274048] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.274068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.276842] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.278481] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.280156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.281824] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.282107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.282131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.985 [2024-07-12 14:00:56.282611] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.284529] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.284936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.285336] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.285680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.285703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.285723] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.285744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.986 [2024-07-12 14:00:56.289113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.290796] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.292544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.294466] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.294959] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.294983] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.296936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.298595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.300265] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.301935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.302225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.302248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.302274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.302302] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.306062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.306732] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.307135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.308101] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.308379] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.308412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.310137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.311804] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.313479] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.314552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.986 [2024-07-12 14:00:56.314838] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.314861] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.314881] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.314901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.317621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.318028] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.318430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.319193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.319472] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.319495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.319910] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.320314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.321733] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.323031] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.323308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.323331] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.323351] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.323372] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.326353] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.328040] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.329748] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.331673] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.332160] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.332186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.986 [2024-07-12 14:00:56.332600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.333008] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.334375] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.335339] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.335798] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.335825] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.335852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.335873] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.339314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.341223] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.342060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.343545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.343821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.343844] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.345527] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.347215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.348392] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.348789] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.349227] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.349255] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.349282] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.349303] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.352687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.986 [2024-07-12 14:00:56.354005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.987 [2024-07-12 14:00:56.355753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.357670] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.357959] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.357982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.358917] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.360702] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.362024] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.363693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.363977] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.364000] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.364020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.364041] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.367684] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.368662] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.369070] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.369771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.370054] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.370078] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.371514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.373194] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.374867] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.376154] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.376532] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.376554] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.987 [2024-07-12 14:00:56.376574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.376595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.379440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.379841] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.380247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.380988] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.381264] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.381287] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.381703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.382108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.383505] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.384812] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.385098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.385121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.385141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.385162] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.388150] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.389830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.391555] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.393479] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.393955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.393977] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.394388] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.394792] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.987 [2024-07-12 14:00:56.396306] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.397134] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.397571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.397599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.397623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.397648] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.400966] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.402741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.403666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.405057] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.405333] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.405355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.407047] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.408721] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.409843] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.410251] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.410694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.410721] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.410747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.410773] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.414281] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.415600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.417286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.419130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.987 [2024-07-12 14:00:56.419409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.419431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.420211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.422139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.423497] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.425157] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.425436] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.987 [2024-07-12 14:00:56.425458] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.425478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.425498] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.429404] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.430059] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.430460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.431513] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.431804] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.431828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.433567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.435232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.436891] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.437993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.438279] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.438305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.438326] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.438346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.988 [2024-07-12 14:00:56.441067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.441482] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.441880] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.442977] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.443255] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.443289] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.443699] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.444105] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.445899] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.447197] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.447478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.447500] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.447520] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.447540] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.450534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.452220] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.453871] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.455467] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.455924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.455957] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.456368] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.456758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.458668] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.459075] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.988 [2024-07-12 14:00:56.459535] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.459564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.459586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.459608] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.462857] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.464083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.465584] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.466901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.467184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.467207] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.469134] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.469192] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.471112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.471526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.472067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.472096] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.472121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.472145] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.474905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.476548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.478482] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.480158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.480437] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.480460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.988 [2024-07-12 14:00:56.481806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.483186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.484500] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.486178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.486456] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.486478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.486498] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.486519] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.988 [2024-07-12 14:00:56.488869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.490474] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.491197] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.491596] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.492006] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.492030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.493771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.495565] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.497297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.499232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.499740] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.499763] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.499783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.499803] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.503159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.503216] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.989 [2024-07-12 14:00:56.503611] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.503662] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.504136] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.504162] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.504779] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.504833] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.506343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.506394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.506884] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.506909] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.506942] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.506965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.510607] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.510667] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.512507] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.512559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.512998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.513025] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.514604] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.514660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.516579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.516640] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.516913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.516939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:07.989 [2024-07-12 14:00:56.516960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.516980] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.520342] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.520398] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.521279] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.521330] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.521764] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.521794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.522431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.522483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.523965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.524018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.524296] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.524318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.524338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.524358] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.527316] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.527373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.529050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.529100] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.529377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.529399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.531129] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:07.989 [2024-07-12 14:00:56.531189] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:08.257 [2024-07-12 14:00:56.683775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.683794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.686321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.687919] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.687982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.689892] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.690180] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.690203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.690267] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.691947] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.692000] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.692621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.692901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.692924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.257 [2024-07-12 14:00:56.692950] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.692971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.694562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.695054] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.695457] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.695854] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.696296] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.696319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.696380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.698259] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:08.258 [2024-07-12 14:00:56.699911] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.701585] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.701868] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.701890] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.701911] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.701937] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.705505] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.707184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.707893] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.708307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.708703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.708728] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.709146] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.711046] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.712353] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.714029] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.714311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.714334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.714355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.714375] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.720619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.721032] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.721550] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:08.258 [2024-07-12 14:00:56.721574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
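The allocation error above is emitted once per affected task, so the raw console contains a long run of near-identical lines. When triaging a run like this it is usually enough to count the messages and bucket them by second; a small sketch with standard tools, assuming the console output has been saved to a file called build.log (that file name is an assumption, not something this job produces):

  # count how many times the allocation failure was hit
  grep -c 'Failed to get dst_mbufs!' build.log

  # bucket the failures by wall-clock second to see how long the condition lasted
  grep 'Failed to get dst_mbufs!' build.log \
    | sed -E 's/.*\[([0-9]{4}-[0-9]{2}-[0-9]{2} [0-9:]{8})\..*/\1/' \
    | sort | uniq -c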
00:35:09.203 00:35:09.203 Latency(us) 00:35:09.203 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:09.203 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:09.203 Verification LBA range: start 0x0 length 0x100 00:35:09.203 crypto_ram : 6.12 41.83 2.61 0.00 0.00 2980979.76 237069.36 2494699.07 00:35:09.203 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:09.203 Verification LBA range: start 0x100 length 0x100 00:35:09.203 crypto_ram : 5.96 32.22 2.01 0.00 0.00 3655330.95 69753.10 3122021.06 00:35:09.203 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:09.203 Verification LBA range: start 0x0 length 0x100 00:35:09.203 crypto_ram1 : 6.12 41.82 2.61 0.00 0.00 2877026.84 235245.75 2290454.71 00:35:09.203 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:09.203 Verification LBA range: start 0x100 length 0x100 00:35:09.203 crypto_ram1 : 5.98 34.77 2.17 0.00 0.00 3292456.31 62458.66 2859421.16 00:35:09.203 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:09.203 Verification LBA range: start 0x0 length 0x100 00:35:09.203 crypto_ram2 : 5.66 252.58 15.79 0.00 0.00 448824.97 14189.97 598144.22 00:35:09.203 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:09.203 Verification LBA range: start 0x100 length 0x100 00:35:09.203 crypto_ram2 : 5.71 210.32 13.15 0.00 0.00 530850.05 33280.89 671088.64 00:35:09.203 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:09.203 Verification LBA range: start 0x0 length 0x100 00:35:09.203 crypto_ram3 : 5.78 265.44 16.59 0.00 0.00 417067.82 48781.58 477785.93 00:35:09.203 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:09.203 Verification LBA range: start 0x100 length 0x100 00:35:09.203 crypto_ram3 : 5.86 228.83 14.30 0.00 0.00 473474.17 4758.48 612733.11 00:35:09.203 =================================================================================================================== 00:35:09.203 Total : 1107.81 69.24 0.00 0.00 847846.07 4758.48 3122021.06 00:35:09.465 00:35:09.465 real 0m9.378s 00:35:09.465 user 0m17.719s 00:35:09.465 sys 0m0.498s 00:35:09.465 14:00:57 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:09.465 14:00:57 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:35:09.465 ************************************ 00:35:09.465 END TEST bdev_verify_big_io 00:35:09.465 ************************************ 00:35:09.465 14:00:57 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:09.465 14:00:57 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:09.465 14:00:57 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:09.465 14:00:57 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:09.465 14:00:57 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:09.465 ************************************ 00:35:09.465 START TEST bdev_write_zeroes 00:35:09.465 ************************************ 00:35:09.465 14:00:57 blockdev_crypto_qat.bdev_write_zeroes -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:09.465 [2024-07-12 14:00:58.000341] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:09.465 [2024-07-12 14:00:58.000403] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid641585 ] 00:35:09.724 [2024-07-12 14:00:58.127619] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:09.724 [2024-07-12 14:00:58.228881] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:09.724 [2024-07-12 14:00:58.250182] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:09.724 [2024-07-12 14:00:58.258212] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:09.724 [2024-07-12 14:00:58.266231] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:09.983 [2024-07-12 14:00:58.376449] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:12.516 [2024-07-12 14:01:00.602570] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:12.516 [2024-07-12 14:01:00.602649] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:12.516 [2024-07-12 14:01:00.602665] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:12.516 [2024-07-12 14:01:00.610575] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:12.516 [2024-07-12 14:01:00.610595] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:12.516 [2024-07-12 14:01:00.610607] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:12.516 [2024-07-12 14:01:00.618595] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:12.516 [2024-07-12 14:01:00.618613] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:12.516 [2024-07-12 14:01:00.618625] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:12.516 [2024-07-12 14:01:00.626616] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:12.516 [2024-07-12 14:01:00.626634] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:12.516 [2024-07-12 14:01:00.626645] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:12.516 Running I/O for 1 seconds... 
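For reference, the write_zeroes pass is plain bdevperf driven against the crypto bdev config generated earlier in the job; a minimal sketch of rerunning it by hand with the exact flags run_test passes above (the workspace path is taken from this log, and bdev.json is assumed to still be in place):

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # 128 outstanding I/Os, 4096-byte I/O size, write_zeroes workload, 1 second run
  "$SPDK"/build/examples/bdevperf \
      --json "$SPDK"/test/bdev/bdev.json \
      -q 128 -o 4096 -w write_zeroes -t 1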
00:35:13.451 00:35:13.451 Latency(us) 00:35:13.451 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:13.451 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:13.451 crypto_ram : 1.03 2021.46 7.90 0.00 0.00 62873.32 5584.81 76135.74 00:35:13.451 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:13.451 crypto_ram1 : 1.03 2034.55 7.95 0.00 0.00 62170.48 5556.31 70209.00 00:35:13.451 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:13.451 crypto_ram2 : 1.02 15567.10 60.81 0.00 0.00 8095.21 2436.23 10656.72 00:35:13.451 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:13.451 crypto_ram3 : 1.02 15599.14 60.93 0.00 0.00 8052.73 2421.98 8377.21 00:35:13.451 =================================================================================================================== 00:35:13.451 Total : 35222.25 137.59 0.00 0.00 14371.13 2421.98 76135.74 00:35:13.709 00:35:13.709 real 0m4.203s 00:35:13.709 user 0m3.762s 00:35:13.709 sys 0m0.383s 00:35:13.709 14:01:02 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:13.709 14:01:02 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:35:13.709 ************************************ 00:35:13.709 END TEST bdev_write_zeroes 00:35:13.709 ************************************ 00:35:13.709 14:01:02 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:13.709 14:01:02 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:13.709 14:01:02 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:13.709 14:01:02 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:13.709 14:01:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:13.709 ************************************ 00:35:13.709 START TEST bdev_json_nonenclosed 00:35:13.709 ************************************ 00:35:13.709 14:01:02 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:13.709 [2024-07-12 14:01:02.281022] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:13.709 [2024-07-12 14:01:02.281103] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid642128 ] 00:35:13.967 [2024-07-12 14:01:02.425164] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:13.967 [2024-07-12 14:01:02.536199] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:13.967 [2024-07-12 14:01:02.536268] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
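The error above is the expected outcome of the nonenclosed negative test: bdevperf's JSON loader requires the whole config file to be a single object. The nonenclosed.json fixture itself is not reproduced in this log; a hypothetical minimal file that trips the same check (any syntactically valid JSON whose top level is not an object) would be:

  # hypothetical file name; valid JSON, but the top level is an array instead of
  # an object, so the loader rejects it with
  # "Invalid JSON configuration: not enclosed in {}."
  cat > /tmp/nonenclosed-example.json <<'EOF'
[
  { "subsystem": "bdev", "config": [] }
]
EOF
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
      --json /tmp/nonenclosed-example.json -q 128 -o 4096 -w write_zeroes -t 1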
00:35:13.967 [2024-07-12 14:01:02.536290] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:13.967 [2024-07-12 14:01:02.536302] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:14.226 00:35:14.226 real 0m0.427s 00:35:14.226 user 0m0.259s 00:35:14.226 sys 0m0.165s 00:35:14.226 14:01:02 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:35:14.226 14:01:02 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:14.226 14:01:02 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:35:14.226 ************************************ 00:35:14.226 END TEST bdev_json_nonenclosed 00:35:14.226 ************************************ 00:35:14.226 14:01:02 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:35:14.226 14:01:02 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true 00:35:14.226 14:01:02 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:14.226 14:01:02 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:14.226 14:01:02 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:14.226 14:01:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:14.226 ************************************ 00:35:14.226 START TEST bdev_json_nonarray 00:35:14.226 ************************************ 00:35:14.226 14:01:02 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:14.486 [2024-07-12 14:01:02.843543] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:14.486 [2024-07-12 14:01:02.843673] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid642267 ] 00:35:14.486 [2024-07-12 14:01:03.040305] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:14.744 [2024-07-12 14:01:03.142628] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:14.744 [2024-07-12 14:01:03.142705] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
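The nonarray case works the same way: the loader accepts the enclosing object but then requires the "subsystems" member to be an array of subsystem objects. The nonarray.json fixture is likewise not shown here; a hypothetical config provoking the same complaint simply makes "subsystems" an object instead:

  # hypothetical file name; enclosed in {}, but "subsystems" is an object rather
  # than an array, so the loader reports
  # "Invalid JSON configuration: 'subsystems' should be an array."
  cat > /tmp/nonarray-example.json <<'EOF'
{
  "subsystems": { "subsystem": "bdev", "config": [] }
}
EOF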
00:35:14.744 [2024-07-12 14:01:03.142727] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:14.744 [2024-07-12 14:01:03.142739] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:14.744 00:35:14.744 real 0m0.511s 00:35:14.744 user 0m0.297s 00:35:14.744 sys 0m0.209s 00:35:14.744 14:01:03 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:35:14.744 14:01:03 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:14.744 14:01:03 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:35:14.744 ************************************ 00:35:14.744 END TEST bdev_json_nonarray 00:35:14.744 ************************************ 00:35:14.744 14:01:03 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:35:14.744 14:01:03 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:35:14.744 00:35:14.744 real 1m14.245s 00:35:14.744 user 2m44.020s 00:35:14.744 sys 0m9.748s 00:35:14.744 14:01:03 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:14.744 14:01:03 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:14.744 ************************************ 00:35:14.744 END TEST blockdev_crypto_qat 00:35:14.744 ************************************ 00:35:15.003 14:01:03 -- common/autotest_common.sh@1142 -- # return 0 00:35:15.003 14:01:03 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:35:15.003 14:01:03 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:35:15.003 14:01:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:15.003 14:01:03 -- common/autotest_common.sh@10 -- # set +x 00:35:15.003 ************************************ 00:35:15.003 START TEST chaining 00:35:15.003 ************************************ 00:35:15.003 14:01:03 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:35:15.003 * Looking for test storage... 
00:35:15.003 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:15.003 14:01:03 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@7 -- # uname -s 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:35:15.003 14:01:03 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:35:15.003 14:01:03 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:35:15.003 14:01:03 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:35:15.003 14:01:03 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:15.003 14:01:03 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:15.003 14:01:03 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:15.003 14:01:03 chaining -- paths/export.sh@5 -- # 
export PATH 00:35:15.003 14:01:03 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@47 -- # : 0 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:35:15.003 14:01:03 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:35:15.003 14:01:03 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:35:15.003 14:01:03 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:35:15.003 14:01:03 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:35:15.003 14:01:03 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:35:15.003 14:01:03 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:15.003 14:01:03 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:15.003 14:01:03 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:15.003 14:01:03 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:15.003 14:01:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:23.130 
14:01:11 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:35:23.130 14:01:11 chaining -- nvmf/common.sh@336 -- # return 1 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:35:23.131 WARNING: No supported devices were found, fallback requested for tcp test 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:23.131 14:01:11 
chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:23.131 Cannot find device "nvmf_init_br" 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@154 -- # true 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:23.131 Cannot find device "nvmf_tgt_br" 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@155 -- # true 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:23.131 Cannot find device "nvmf_tgt_br2" 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@156 -- # true 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:23.131 Cannot find device "nvmf_init_br" 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@157 -- # true 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:23.131 Cannot find device "nvmf_tgt_br" 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@158 -- # true 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:23.131 Cannot find device "nvmf_tgt_br2" 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@159 -- # true 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:23.131 Cannot find device "nvmf_br" 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@160 -- # true 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:23.131 Cannot find device "nvmf_init_if" 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@161 -- # true 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:23.131 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@162 -- # true 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:23.131 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@163 -- # true 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 
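Since no supported NIC was found, common.sh falls back to the purely virtual topology being assembled above: a network namespace for the target, veth pairs for the initiator and target sides, and a bridge joining them. A condensed sketch of that topology with the xtrace noise stripped, using the interface names and addresses from this run (the second target interface, nvmf_tgt_if2 with 10.0.0.3, is set up the same way and omitted for brevity):

  # target lives in its own namespace; the initiator side stays in the root namespace
  ip netns add nvmf_tgt_ns_spdk
  ip link add nvmf_init_if type veth peer name nvmf_init_br
  ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br
  ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk

  # 10.0.0.1 = initiator, 10.0.0.2 = address the target will later listen on (port 4420)
  ip addr add 10.0.0.1/24 dev nvmf_init_if
  ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if

  # bring everything up and join the bridge-facing veth ends in one bridge
  ip link set nvmf_init_if up
  ip link set nvmf_init_br up
  ip link set nvmf_tgt_br up
  ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up
  ip netns exec nvmf_tgt_ns_spdk ip link set lo up
  ip link add nvmf_br type bridge
  ip link set nvmf_br up
  ip link set nvmf_init_br master nvmf_br
  ip link set nvmf_tgt_br master nvmf_br

  # allow NVMe/TCP traffic in on the initiator interface
  iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT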
00:35:23.131 14:01:11 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:35:23.131 14:01:11 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:35:23.390 14:01:11 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:35:23.390 14:01:11 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:35:23.390 14:01:11 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:35:23.391 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:23.391 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.095 ms 00:35:23.391 00:35:23.391 --- 10.0.0.2 ping statistics --- 00:35:23.391 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:23.391 rtt min/avg/max/mdev = 0.095/0.095/0.095/0.000 ms 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:35:23.391 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:35:23.391 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.070 ms 00:35:23.391 00:35:23.391 --- 10.0.0.3 ping statistics --- 00:35:23.391 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:23.391 rtt min/avg/max/mdev = 0.070/0.070/0.070/0.000 ms 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:35:23.391 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:35:23.391 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.052 ms 00:35:23.391 00:35:23.391 --- 10.0.0.1 ping statistics --- 00:35:23.391 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:23.391 rtt min/avg/max/mdev = 0.052/0.052/0.052/0.000 ms 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@433 -- # return 0 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:23.391 14:01:11 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:23.391 14:01:11 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:23.391 14:01:11 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@481 -- # nvmfpid=645997 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:35:23.391 14:01:11 chaining -- nvmf/common.sh@482 -- # waitforlisten 645997 00:35:23.391 14:01:11 chaining -- common/autotest_common.sh@829 -- # '[' -z 645997 ']' 00:35:23.391 14:01:11 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:23.391 14:01:11 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:23.391 14:01:11 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:23.391 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:23.391 14:01:11 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:23.391 14:01:11 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:23.650 [2024-07-12 14:01:12.006289] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:23.650 [2024-07-12 14:01:12.006359] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:23.650 [2024-07-12 14:01:12.148943] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:23.910 [2024-07-12 14:01:12.265474] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:23.910 [2024-07-12 14:01:12.265530] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:23.910 [2024-07-12 14:01:12.265552] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:23.910 [2024-07-12 14:01:12.265569] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:35:23.910 [2024-07-12 14:01:12.265583] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
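nvmfappstart boots nvmf_tgt inside the target namespace and waitforlisten blocks until its RPC socket answers, after which chaining.sh starts issuing rpc_cmd calls. A rough sketch of that pattern together with the accel_get_stats and jq queries that the get_stat checks further below are built on; the polling loop and the socket path are assumptions based on the defaults visible in this log, not a copy of waitforlisten:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

  # start the target in the namespace with the same flags nvmfappstart uses here
  ip netns exec nvmf_tgt_ns_spdk "$SPDK"/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 &

  # poll the default RPC socket until the target answers (stand-in for waitforlisten)
  until "$SPDK"/scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
      sleep 0.5
  done

  # the get_stat helper in chaining.sh boils down to queries like these
  "$SPDK"/scripts/rpc.py accel_get_stats | jq -r .sequence_executed
  "$SPDK"/scripts/rpc.py accel_get_stats | jq -r '.operations[] | select(.opcode == "encrypt").executed'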
00:35:23.910 [2024-07-12 14:01:12.265624] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:24.479 14:01:12 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:24.479 14:01:12 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:24.479 14:01:12 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:24.479 14:01:12 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:24.479 14:01:12 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:24.479 14:01:12 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:24.479 14:01:12 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:24.479 14:01:12 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.XDXR78mgGM 00:35:24.479 14:01:12 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:24.479 14:01:12 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.9RtUsbjneJ 00:35:24.479 14:01:12 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:24.479 14:01:12 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:35:24.479 14:01:12 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.479 14:01:12 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:24.479 malloc0 00:35:24.479 true 00:35:24.480 true 00:35:24.480 [2024-07-12 14:01:13.037052] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:24.480 crypto0 00:35:24.480 [2024-07-12 14:01:13.045082] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:24.480 crypto1 00:35:24.480 [2024-07-12 14:01:13.053225] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:24.738 [2024-07-12 14:01:13.069483] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:24.738 14:01:13 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@85 -- # update_stats 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:24.738 14:01:13 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:24.738 14:01:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:24.738 14:01:13 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:24.738 14:01:13 chaining -- 
bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:24.738 14:01:13 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.738 14:01:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:24.738 14:01:13 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:24.738 14:01:13 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:24.739 14:01:13 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.739 14:01:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:24.739 14:01:13 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:24.739 14:01:13 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.739 14:01:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:24.739 14:01:13 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.XDXR78mgGM bs=1K count=64 00:35:24.739 64+0 records in 00:35:24.739 64+0 records out 00:35:24.739 65536 bytes (66 kB, 64 KiB) copied, 0.00107574 s, 60.9 MB/s 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.XDXR78mgGM --ob Nvme0n1 --bs 65536 --count 1 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@25 -- # local config 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:24.739 14:01:13 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:24.739 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:24.997 14:01:13 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:24.997 "subsystems": [ 00:35:24.997 { 00:35:24.997 "subsystem": "bdev", 00:35:24.997 "config": [ 00:35:24.997 { 00:35:24.997 "method": "bdev_nvme_attach_controller", 00:35:24.997 "params": { 00:35:24.997 "trtype": "tcp", 
00:35:24.997 "adrfam": "IPv4", 00:35:24.997 "name": "Nvme0", 00:35:24.997 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:24.997 "traddr": "10.0.0.2", 00:35:24.997 "trsvcid": "4420" 00:35:24.997 } 00:35:24.997 }, 00:35:24.997 { 00:35:24.997 "method": "bdev_set_options", 00:35:24.997 "params": { 00:35:24.997 "bdev_auto_examine": false 00:35:24.997 } 00:35:24.997 } 00:35:24.997 ] 00:35:24.997 } 00:35:24.997 ] 00:35:24.997 }' 00:35:24.997 14:01:13 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.XDXR78mgGM --ob Nvme0n1 --bs 65536 --count 1 00:35:24.997 14:01:13 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:24.997 "subsystems": [ 00:35:24.997 { 00:35:24.997 "subsystem": "bdev", 00:35:24.997 "config": [ 00:35:24.997 { 00:35:24.997 "method": "bdev_nvme_attach_controller", 00:35:24.997 "params": { 00:35:24.997 "trtype": "tcp", 00:35:24.997 "adrfam": "IPv4", 00:35:24.997 "name": "Nvme0", 00:35:24.997 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:24.997 "traddr": "10.0.0.2", 00:35:24.997 "trsvcid": "4420" 00:35:24.997 } 00:35:24.997 }, 00:35:24.997 { 00:35:24.997 "method": "bdev_set_options", 00:35:24.997 "params": { 00:35:24.997 "bdev_auto_examine": false 00:35:24.997 } 00:35:24.997 } 00:35:24.997 ] 00:35:24.997 } 00:35:24.997 ] 00:35:24.997 }' 00:35:24.997 [2024-07-12 14:01:13.383054] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:24.997 [2024-07-12 14:01:13.383105] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646220 ] 00:35:24.997 [2024-07-12 14:01:13.497218] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:25.256 [2024-07-12 14:01:13.602822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:25.515  Copying: 64/64 [kB] (average 10 MBps) 00:35:25.515 00:35:25.515 14:01:14 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:35:25.515 14:01:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.515 14:01:14 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:25.515 14:01:14 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:25.515 14:01:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.515 14:01:14 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:25.515 14:01:14 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:25.515 14:01:14 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:25.515 14:01:14 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.515 14:01:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.515 14:01:14 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 
00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:25.774 14:01:14 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.774 14:01:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.774 14:01:14 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:25.774 14:01:14 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.774 14:01:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.774 14:01:14 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:25.774 14:01:14 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.774 14:01:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:25.774 14:01:14 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.774 14:01:14 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@96 -- # update_stats 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:25.775 14:01:14 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.775 14:01:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.775 14:01:14 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@52 -- # get_stat executed 
encrypt 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:25.775 14:01:14 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.775 14:01:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.775 14:01:14 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.775 14:01:14 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:26.034 14:01:14 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:26.034 14:01:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.034 14:01:14 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:26.034 14:01:14 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:26.034 14:01:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:26.034 14:01:14 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.9RtUsbjneJ --ib Nvme0n1 --bs 65536 --count 1 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@25 -- # local config 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:26.034 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:26.034 14:01:14 chaining -- 
bdev/chaining.sh@31 -- # config='{ 00:35:26.034 "subsystems": [ 00:35:26.034 { 00:35:26.034 "subsystem": "bdev", 00:35:26.034 "config": [ 00:35:26.034 { 00:35:26.034 "method": "bdev_nvme_attach_controller", 00:35:26.034 "params": { 00:35:26.034 "trtype": "tcp", 00:35:26.034 "adrfam": "IPv4", 00:35:26.034 "name": "Nvme0", 00:35:26.034 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:26.034 "traddr": "10.0.0.2", 00:35:26.034 "trsvcid": "4420" 00:35:26.034 } 00:35:26.034 }, 00:35:26.034 { 00:35:26.034 "method": "bdev_set_options", 00:35:26.034 "params": { 00:35:26.034 "bdev_auto_examine": false 00:35:26.034 } 00:35:26.034 } 00:35:26.034 ] 00:35:26.034 } 00:35:26.034 ] 00:35:26.034 }' 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.9RtUsbjneJ --ib Nvme0n1 --bs 65536 --count 1 00:35:26.034 14:01:14 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:26.034 "subsystems": [ 00:35:26.034 { 00:35:26.034 "subsystem": "bdev", 00:35:26.034 "config": [ 00:35:26.034 { 00:35:26.034 "method": "bdev_nvme_attach_controller", 00:35:26.034 "params": { 00:35:26.034 "trtype": "tcp", 00:35:26.034 "adrfam": "IPv4", 00:35:26.034 "name": "Nvme0", 00:35:26.034 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:26.034 "traddr": "10.0.0.2", 00:35:26.034 "trsvcid": "4420" 00:35:26.034 } 00:35:26.034 }, 00:35:26.034 { 00:35:26.034 "method": "bdev_set_options", 00:35:26.034 "params": { 00:35:26.034 "bdev_auto_examine": false 00:35:26.034 } 00:35:26.034 } 00:35:26.034 ] 00:35:26.034 } 00:35:26.034 ] 00:35:26.034 }' 00:35:26.034 [2024-07-12 14:01:14.560599] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:26.034 [2024-07-12 14:01:14.560672] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646430 ] 00:35:26.293 [2024-07-12 14:01:14.691556] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:26.293 [2024-07-12 14:01:14.792998] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:26.812  Copying: 64/64 [kB] (average 10 MBps) 00:35:26.812 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:26.812 14:01:15 chaining -- 
bdev/chaining.sh@39 -- # opcode=encrypt 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.812 14:01:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:26.812 14:01:15 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:27.072 14:01:15 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:35:27.072 14:01:15 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.XDXR78mgGM /tmp/tmp.9RtUsbjneJ 00:35:27.072 14:01:15 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:35:27.072 14:01:15 chaining -- bdev/chaining.sh@25 -- # local config 00:35:27.072 14:01:15 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:27.072 14:01:15 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:27.072 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:27.072 14:01:15 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:27.072 "subsystems": [ 00:35:27.072 { 
00:35:27.072 "subsystem": "bdev", 00:35:27.072 "config": [ 00:35:27.072 { 00:35:27.072 "method": "bdev_nvme_attach_controller", 00:35:27.072 "params": { 00:35:27.072 "trtype": "tcp", 00:35:27.072 "adrfam": "IPv4", 00:35:27.072 "name": "Nvme0", 00:35:27.072 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:27.072 "traddr": "10.0.0.2", 00:35:27.073 "trsvcid": "4420" 00:35:27.073 } 00:35:27.073 }, 00:35:27.073 { 00:35:27.073 "method": "bdev_set_options", 00:35:27.073 "params": { 00:35:27.073 "bdev_auto_examine": false 00:35:27.073 } 00:35:27.073 } 00:35:27.073 ] 00:35:27.073 } 00:35:27.073 ] 00:35:27.073 }' 00:35:27.073 14:01:15 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:35:27.073 14:01:15 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:27.073 "subsystems": [ 00:35:27.073 { 00:35:27.073 "subsystem": "bdev", 00:35:27.073 "config": [ 00:35:27.073 { 00:35:27.073 "method": "bdev_nvme_attach_controller", 00:35:27.073 "params": { 00:35:27.073 "trtype": "tcp", 00:35:27.073 "adrfam": "IPv4", 00:35:27.073 "name": "Nvme0", 00:35:27.073 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:27.073 "traddr": "10.0.0.2", 00:35:27.073 "trsvcid": "4420" 00:35:27.073 } 00:35:27.073 }, 00:35:27.073 { 00:35:27.073 "method": "bdev_set_options", 00:35:27.073 "params": { 00:35:27.073 "bdev_auto_examine": false 00:35:27.073 } 00:35:27.073 } 00:35:27.073 ] 00:35:27.073 } 00:35:27.073 ] 00:35:27.073 }' 00:35:27.073 [2024-07-12 14:01:15.533582] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:27.073 [2024-07-12 14:01:15.533649] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646624 ] 00:35:27.332 [2024-07-12 14:01:15.665298] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:27.332 [2024-07-12 14:01:15.768783] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:27.852  Copying: 64/64 [kB] (average 15 MBps) 00:35:27.852 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@106 -- # update_stats 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 
00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:27.852 14:01:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.XDXR78mgGM --ob Nvme0n1 --bs 4096 --count 16 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@25 -- # local config 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:27.852 14:01:16 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:27.852 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:28.111 14:01:16 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:28.111 "subsystems": [ 00:35:28.111 { 00:35:28.111 "subsystem": "bdev", 00:35:28.111 "config": [ 00:35:28.111 { 00:35:28.111 "method": "bdev_nvme_attach_controller", 00:35:28.111 "params": { 00:35:28.111 
"trtype": "tcp", 00:35:28.111 "adrfam": "IPv4", 00:35:28.111 "name": "Nvme0", 00:35:28.111 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:28.111 "traddr": "10.0.0.2", 00:35:28.111 "trsvcid": "4420" 00:35:28.111 } 00:35:28.111 }, 00:35:28.112 { 00:35:28.112 "method": "bdev_set_options", 00:35:28.112 "params": { 00:35:28.112 "bdev_auto_examine": false 00:35:28.112 } 00:35:28.112 } 00:35:28.112 ] 00:35:28.112 } 00:35:28.112 ] 00:35:28.112 }' 00:35:28.112 14:01:16 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.XDXR78mgGM --ob Nvme0n1 --bs 4096 --count 16 00:35:28.112 14:01:16 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:28.112 "subsystems": [ 00:35:28.112 { 00:35:28.112 "subsystem": "bdev", 00:35:28.112 "config": [ 00:35:28.112 { 00:35:28.112 "method": "bdev_nvme_attach_controller", 00:35:28.112 "params": { 00:35:28.112 "trtype": "tcp", 00:35:28.112 "adrfam": "IPv4", 00:35:28.112 "name": "Nvme0", 00:35:28.112 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:28.112 "traddr": "10.0.0.2", 00:35:28.112 "trsvcid": "4420" 00:35:28.112 } 00:35:28.112 }, 00:35:28.112 { 00:35:28.112 "method": "bdev_set_options", 00:35:28.112 "params": { 00:35:28.112 "bdev_auto_examine": false 00:35:28.112 } 00:35:28.112 } 00:35:28.112 ] 00:35:28.112 } 00:35:28.112 ] 00:35:28.112 }' 00:35:28.112 [2024-07-12 14:01:16.498217] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:28.112 [2024-07-12 14:01:16.498292] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646665 ] 00:35:28.112 [2024-07-12 14:01:16.632046] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:28.371 [2024-07-12 14:01:16.734269] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:28.630  Copying: 64/64 [kB] (average 8000 kBps) 00:35:28.630 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:28.630 14:01:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.630 14:01:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.630 14:01:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@44 -- # jq -r 
'.operations[] | select(.opcode == "encrypt").executed' 00:35:28.630 14:01:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:28.630 14:01:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.630 14:01:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@114 -- # update_stats 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:35:28.889 14:01:17 chaining -- 
bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.889 14:01:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:28.889 14:01:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.148 14:01:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.148 14:01:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.148 14:01:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@117 -- # : 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.9RtUsbjneJ --ib Nvme0n1 --bs 4096 --count 16 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@25 -- # local config 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:29.148 
{"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:29.148 "subsystems": [ 00:35:29.148 { 00:35:29.148 "subsystem": "bdev", 00:35:29.148 "config": [ 00:35:29.148 { 00:35:29.148 "method": "bdev_nvme_attach_controller", 00:35:29.148 "params": { 00:35:29.148 "trtype": "tcp", 00:35:29.148 "adrfam": "IPv4", 00:35:29.148 "name": "Nvme0", 00:35:29.148 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:29.148 "traddr": "10.0.0.2", 00:35:29.148 "trsvcid": "4420" 00:35:29.148 } 00:35:29.148 }, 00:35:29.148 { 00:35:29.148 "method": "bdev_set_options", 00:35:29.148 "params": { 00:35:29.148 "bdev_auto_examine": false 00:35:29.148 } 00:35:29.148 } 00:35:29.148 ] 00:35:29.148 } 00:35:29.148 ] 00:35:29.148 }' 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.9RtUsbjneJ --ib Nvme0n1 --bs 4096 --count 16 00:35:29.148 14:01:17 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:29.148 "subsystems": [ 00:35:29.148 { 00:35:29.148 "subsystem": "bdev", 00:35:29.148 "config": [ 00:35:29.148 { 00:35:29.148 "method": "bdev_nvme_attach_controller", 00:35:29.148 "params": { 00:35:29.148 "trtype": "tcp", 00:35:29.148 "adrfam": "IPv4", 00:35:29.148 "name": "Nvme0", 00:35:29.148 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:29.148 "traddr": "10.0.0.2", 00:35:29.148 "trsvcid": "4420" 00:35:29.148 } 00:35:29.148 }, 00:35:29.148 { 00:35:29.148 "method": "bdev_set_options", 00:35:29.148 "params": { 00:35:29.148 "bdev_auto_examine": false 00:35:29.148 } 00:35:29.148 } 00:35:29.148 ] 00:35:29.148 } 00:35:29.148 ] 00:35:29.148 }' 00:35:29.148 [2024-07-12 14:01:17.625169] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:35:29.148 [2024-07-12 14:01:17.625239] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid646870 ] 00:35:29.407 [2024-07-12 14:01:17.756651] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:29.407 [2024-07-12 14:01:17.860396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:29.925  Copying: 64/64 [kB] (average 1306 kBps) 00:35:29.925 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:29.925 14:01:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.925 14:01:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.925 14:01:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.925 14:01:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.925 14:01:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.925 14:01:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.925 14:01:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.925 14:01:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:29.925 14:01:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 
)) 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:29.925 14:01:18 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:29.926 14:01:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:29.926 14:01:18 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:29.926 14:01:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:29.926 14:01:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:29.926 14:01:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:29.926 14:01:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.185 14:01:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:30.185 14:01:18 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:35:30.185 14:01:18 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.XDXR78mgGM /tmp/tmp.9RtUsbjneJ 00:35:30.185 14:01:18 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:35:30.185 14:01:18 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:35:30.185 14:01:18 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.XDXR78mgGM /tmp/tmp.9RtUsbjneJ 00:35:30.185 14:01:18 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:35:30.185 14:01:18 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:30.185 14:01:18 chaining -- nvmf/common.sh@117 -- # sync 00:35:30.185 14:01:18 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:30.185 14:01:18 chaining -- nvmf/common.sh@120 -- # set +e 00:35:30.185 14:01:18 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:30.185 14:01:18 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:30.185 rmmod nvme_tcp 00:35:30.185 rmmod nvme_fabrics 00:35:30.185 rmmod nvme_keyring 00:35:30.185 14:01:18 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:30.185 14:01:18 chaining -- nvmf/common.sh@124 -- # set -e 00:35:30.185 14:01:18 chaining -- nvmf/common.sh@125 -- # return 0 00:35:30.185 14:01:18 chaining -- nvmf/common.sh@489 -- # '[' -n 645997 ']' 00:35:30.185 14:01:18 chaining -- nvmf/common.sh@490 -- # killprocess 645997 00:35:30.185 14:01:18 chaining -- common/autotest_common.sh@948 -- # '[' -z 645997 ']' 00:35:30.185 14:01:18 chaining -- common/autotest_common.sh@952 -- # kill -0 645997 00:35:30.185 14:01:18 chaining -- common/autotest_common.sh@953 -- # uname 00:35:30.185 14:01:18 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:30.185 14:01:18 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 645997 00:35:30.185 14:01:18 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:30.185 14:01:18 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:30.185 14:01:18 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 645997' 00:35:30.185 killing process with pid 645997 00:35:30.185 14:01:18 chaining -- common/autotest_common.sh@967 -- # kill 645997 00:35:30.185 14:01:18 chaining -- common/autotest_common.sh@972 -- # wait 645997 00:35:30.444 14:01:18 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:30.444 14:01:18 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:30.444 14:01:18 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:30.444 14:01:18 chaining -- nvmf/common.sh@274 -- # [[ 
nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:30.444 14:01:18 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:30.444 14:01:18 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:30.444 14:01:18 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:30.444 14:01:18 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:30.444 14:01:18 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:35:30.444 14:01:18 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:30.444 14:01:18 chaining -- bdev/chaining.sh@132 -- # bperfpid=647080 00:35:30.444 14:01:18 chaining -- bdev/chaining.sh@134 -- # waitforlisten 647080 00:35:30.444 14:01:18 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:30.444 14:01:18 chaining -- common/autotest_common.sh@829 -- # '[' -z 647080 ']' 00:35:30.444 14:01:18 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:30.444 14:01:18 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:30.444 14:01:18 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:30.444 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:30.444 14:01:18 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:30.444 14:01:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:30.703 [2024-07-12 14:01:19.054823] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:30.703 [2024-07-12 14:01:19.054894] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647080 ] 00:35:30.703 [2024-07-12 14:01:19.177979] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:30.962 [2024-07-12 14:01:19.287485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:31.550 14:01:19 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:31.550 14:01:19 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:31.550 14:01:19 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:35:31.550 14:01:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:31.550 14:01:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:31.550 malloc0 00:35:31.550 true 00:35:31.550 true 00:35:31.550 [2024-07-12 14:01:20.101271] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:31.550 crypto0 00:35:31.550 [2024-07-12 14:01:20.109298] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:31.809 crypto1 00:35:31.809 14:01:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:31.809 14:01:20 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:31.809 Running I/O for 5 seconds... 
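The bdevperf phase above follows SPDK's usual two-step pattern: the binary is started idle with -z and --wait-for-rpc, the malloc/crypto chain is configured over RPC, and examples/bdev/bdevperf/bdevperf.py perform_tests triggers the measured run whose latency table follows. A rough sketch of that sequence (the configuration step in the middle is only indicated by comments; the trace performs it inside its rpc_cmd block):

# Start bdevperf idle: 4 KiB IOs, queue depth 256, verify workload, 5 second run.
./build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
bperfpid=$!

# (assumed) create malloc0, the crypto keys and the crypto0/crypto1 chain via rpc.py,
# then resume initialization -- e.g. ./scripts/rpc.py framework_start_init.

# Kick off the run, then tear the process down as the killprocess trace does later.
./examples/bdev/bdevperf/bdevperf.py perform_tests
kill $bperfpid
wait $bperfpid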
00:35:37.077 00:35:37.077 Latency(us) 00:35:37.077 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:37.077 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:37.077 Verification LBA range: start 0x0 length 0x2000 00:35:37.077 crypto1 : 5.01 11457.26 44.75 0.00 0.00 22277.02 1389.08 14303.94 00:35:37.077 =================================================================================================================== 00:35:37.077 Total : 11457.26 44.75 0.00 0.00 22277.02 1389.08 14303.94 00:35:37.077 0 00:35:37.077 14:01:25 chaining -- bdev/chaining.sh@146 -- # killprocess 647080 00:35:37.077 14:01:25 chaining -- common/autotest_common.sh@948 -- # '[' -z 647080 ']' 00:35:37.077 14:01:25 chaining -- common/autotest_common.sh@952 -- # kill -0 647080 00:35:37.077 14:01:25 chaining -- common/autotest_common.sh@953 -- # uname 00:35:37.077 14:01:25 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:37.077 14:01:25 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 647080 00:35:37.077 14:01:25 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:37.077 14:01:25 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:37.077 14:01:25 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 647080' 00:35:37.077 killing process with pid 647080 00:35:37.077 14:01:25 chaining -- common/autotest_common.sh@967 -- # kill 647080 00:35:37.077 Received shutdown signal, test time was about 5.000000 seconds 00:35:37.077 00:35:37.077 Latency(us) 00:35:37.077 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:37.078 =================================================================================================================== 00:35:37.078 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:37.078 14:01:25 chaining -- common/autotest_common.sh@972 -- # wait 647080 00:35:37.078 14:01:25 chaining -- bdev/chaining.sh@152 -- # bperfpid=647956 00:35:37.078 14:01:25 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:37.078 14:01:25 chaining -- bdev/chaining.sh@154 -- # waitforlisten 647956 00:35:37.078 14:01:25 chaining -- common/autotest_common.sh@829 -- # '[' -z 647956 ']' 00:35:37.078 14:01:25 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:37.078 14:01:25 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:37.078 14:01:25 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:37.078 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:37.078 14:01:25 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:37.078 14:01:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:37.337 [2024-07-12 14:01:25.666949] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
00:35:37.337 [2024-07-12 14:01:25.667088] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647956 ] 00:35:37.337 [2024-07-12 14:01:25.863421] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:37.596 [2024-07-12 14:01:25.966154] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:37.597 14:01:26 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:37.597 14:01:26 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:37.597 14:01:26 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:35:37.597 14:01:26 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:37.597 14:01:26 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:37.856 malloc0 00:35:37.856 true 00:35:37.856 true 00:35:37.856 [2024-07-12 14:01:26.221751] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:35:37.856 [2024-07-12 14:01:26.221803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:37.856 [2024-07-12 14:01:26.221827] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1489030 00:35:37.856 [2024-07-12 14:01:26.221841] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:37.856 [2024-07-12 14:01:26.223021] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:37.856 [2024-07-12 14:01:26.223047] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:35:37.856 pt0 00:35:37.856 [2024-07-12 14:01:26.229781] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:37.856 crypto0 00:35:37.856 [2024-07-12 14:01:26.237802] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:37.856 crypto1 00:35:37.856 14:01:26 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:37.856 14:01:26 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:38.115 Running I/O for 5 seconds... 
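The second bdevperf run differs from the first only in the bdev stack: the vbdev_passthru notices above show a pt0 bdev claiming malloc0 before crypto0 and crypto1 are layered on top, so every verify I/O now crosses passthru plus two crypto bdevs. A hedged sketch of the RPC chain that would build such a stack (sizes and key parameters are illustrative, and the exact bdev_crypto_create argument syntax varies between SPDK releases, so that step is left as a comment):

# (assumed RPC sequence; argument values are illustrative)
./scripts/rpc.py bdev_malloc_create 16 512 -b malloc0      # RAM-backed base bdev
./scripts/rpc.py bdev_passthru_create -b malloc0 -p pt0    # pt0 claims malloc0
# accel_crypto_key_create plus two bdev_crypto_create calls then stack crypto0 and
# crypto1 on top of pt0, matching the 'Found key "key0"/"key1"' notices in the log.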
00:35:43.501 00:35:43.501 Latency(us) 00:35:43.501 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:43.501 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:43.501 Verification LBA range: start 0x0 length 0x2000 00:35:43.501 crypto1 : 5.02 9070.31 35.43 0.00 0.00 28141.31 6610.59 17552.25 00:35:43.501 =================================================================================================================== 00:35:43.501 Total : 9070.31 35.43 0.00 0.00 28141.31 6610.59 17552.25 00:35:43.501 0 00:35:43.501 14:01:31 chaining -- bdev/chaining.sh@167 -- # killprocess 647956 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@948 -- # '[' -z 647956 ']' 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@952 -- # kill -0 647956 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@953 -- # uname 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 647956 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 647956' 00:35:43.501 killing process with pid 647956 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@967 -- # kill 647956 00:35:43.501 Received shutdown signal, test time was about 5.000000 seconds 00:35:43.501 00:35:43.501 Latency(us) 00:35:43.501 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:43.501 =================================================================================================================== 00:35:43.501 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@972 -- # wait 647956 00:35:43.501 14:01:31 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:35:43.501 14:01:31 chaining -- bdev/chaining.sh@170 -- # killprocess 647956 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@948 -- # '[' -z 647956 ']' 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@952 -- # kill -0 647956 00:35:43.501 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (647956) - No such process 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 647956 is not found' 00:35:43.501 Process with pid 647956 is not found 00:35:43.501 14:01:31 chaining -- bdev/chaining.sh@171 -- # wait 647956 00:35:43.501 14:01:31 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:43.501 14:01:31 chaining -- 
nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:43.501 14:01:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@336 -- # return 1 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:35:43.501 WARNING: No supported devices were found, fallback requested for tcp test 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:43.501 14:01:31 chaining -- 
nvmf/common.sh@432 -- # nvmf_veth_init 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:43.501 Cannot find device "nvmf_tgt_br" 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@155 -- # true 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:43.501 Cannot find device "nvmf_tgt_br2" 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@156 -- # true 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:43.501 Cannot find device "nvmf_tgt_br" 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@158 -- # true 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:43.501 Cannot find device "nvmf_tgt_br2" 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@159 -- # true 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:43.501 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@162 -- # true 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:43.501 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@163 -- # true 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:43.501 14:01:31 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:43.501 14:01:32 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:43.501 14:01:32 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:43.501 14:01:32 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:43.501 14:01:32 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@178 -- # ip addr 
add 10.0.0.1/24 dev nvmf_init_if 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:35:43.760 14:01:32 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:35:44.020 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:44.020 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.108 ms 00:35:44.020 00:35:44.020 --- 10.0.0.2 ping statistics --- 00:35:44.020 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:44.020 rtt min/avg/max/mdev = 0.108/0.108/0.108/0.000 ms 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:35:44.020 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:35:44.020 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.063 ms 00:35:44.020 00:35:44.020 --- 10.0.0.3 ping statistics --- 00:35:44.020 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:44.020 rtt min/avg/max/mdev = 0.063/0.063/0.063/0.000 ms 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:35:44.020 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:35:44.020 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.025 ms 00:35:44.020 00:35:44.020 --- 10.0.0.1 ping statistics --- 00:35:44.020 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:44.020 rtt min/avg/max/mdev = 0.025/0.025/0.025/0.000 ms 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@433 -- # return 0 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:44.020 14:01:32 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:44.020 14:01:32 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:44.020 14:01:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@481 -- # nvmfpid=649095 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:35:44.020 14:01:32 chaining -- nvmf/common.sh@482 -- # waitforlisten 649095 00:35:44.020 14:01:32 chaining -- common/autotest_common.sh@829 -- # '[' -z 649095 ']' 00:35:44.020 14:01:32 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:44.020 14:01:32 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:44.020 14:01:32 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:44.020 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:44.020 14:01:32 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:44.020 14:01:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:44.020 [2024-07-12 14:01:32.479913] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:44.020 [2024-07-12 14:01:32.479993] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:44.281 [2024-07-12 14:01:32.623512] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:44.281 [2024-07-12 14:01:32.754551] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:44.281 [2024-07-12 14:01:32.754616] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:44.281 [2024-07-12 14:01:32.754635] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:44.281 [2024-07-12 14:01:32.754651] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 
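For reference, the veth/namespace topology that nvmf_veth_init builds above (entered because of the phy-fallback warning) can be reproduced on its own. A minimal sketch, using only the interface names and 10.0.0.x addresses visible in this log; it is not the nvmf/common.sh implementation, and the second target interface is omitted:

    # test namespace for the target; the initiator stays in the default namespace
    ip netns add nvmf_tgt_ns_spdk
    ip link add nvmf_init_if type veth peer name nvmf_init_br
    ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br
    ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk
    # addresses as in the log: initiator 10.0.0.1, target 10.0.0.2
    ip addr add 10.0.0.1/24 dev nvmf_init_if
    ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if
    ip link set nvmf_init_if up
    ip link set nvmf_init_br up
    ip link set nvmf_tgt_br up
    ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up
    ip netns exec nvmf_tgt_ns_spdk ip link set lo up
    # bridge the two host-side peers so initiator and target can reach each other
    ip link add nvmf_br type bridge
    ip link set nvmf_br up
    ip link set nvmf_init_br master nvmf_br
    ip link set nvmf_tgt_br master nvmf_br
    iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT
    iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT
    ping -c 1 10.0.0.2   # same connectivity check the log performs before starting the target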
00:35:44.281 [2024-07-12 14:01:32.754665] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:35:44.281 [2024-07-12 14:01:32.754712] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:44.850 14:01:33 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:44.850 14:01:33 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:44.850 14:01:33 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:44.850 14:01:33 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:44.850 14:01:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:45.109 14:01:33 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:45.109 14:01:33 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:35:45.109 14:01:33 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:45.109 14:01:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:45.109 malloc0 00:35:45.109 [2024-07-12 14:01:33.495224] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:45.109 [2024-07-12 14:01:33.511488] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:45.109 14:01:33 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:45.109 14:01:33 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:35:45.109 14:01:33 chaining -- bdev/chaining.sh@189 -- # bperfpid=649260 00:35:45.109 14:01:33 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:45.109 14:01:33 chaining -- bdev/chaining.sh@191 -- # waitforlisten 649260 /var/tmp/bperf.sock 00:35:45.109 14:01:33 chaining -- common/autotest_common.sh@829 -- # '[' -z 649260 ']' 00:35:45.109 14:01:33 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:45.109 14:01:33 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:45.109 14:01:33 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:45.109 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:45.109 14:01:33 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:45.109 14:01:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:45.109 [2024-07-12 14:01:33.589181] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 
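The waitforlisten step above blocks until the bdevperf process starts answering on /var/tmp/bperf.sock. A rough stand-in for that wait, assuming only that scripts/rpc.py and the standard rpc_get_methods RPC are available (this is a sketch, not the autotest_common.sh helper itself):

    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    BPERF_SOCK=/var/tmp/bperf.sock
    # poll the RPC socket until the application responds, with a bounded number of retries
    for _ in $(seq 1 100); do
        if "$RPC" -s "$BPERF_SOCK" rpc_get_methods >/dev/null 2>&1; then
            break
        fi
        sleep 0.1
    done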
00:35:45.109 [2024-07-12 14:01:33.589248] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid649260 ] 00:35:45.368 [2024-07-12 14:01:33.719851] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:45.368 [2024-07-12 14:01:33.822047] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:45.937 14:01:34 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:45.937 14:01:34 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:45.937 14:01:34 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:35:45.937 14:01:34 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:46.506 [2024-07-12 14:01:34.793080] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:46.506 nvme0n1 00:35:46.506 true 00:35:46.506 crypto0 00:35:46.506 14:01:34 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:46.506 Running I/O for 5 seconds... 00:35:51.783 00:35:51.783 Latency(us) 00:35:51.783 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:51.783 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:51.783 Verification LBA range: start 0x0 length 0x2000 00:35:51.783 crypto0 : 5.03 6939.10 27.11 0.00 0.00 36763.43 5356.86 34192.70 00:35:51.783 =================================================================================================================== 00:35:51.783 Total : 6939.10 27.11 0.00 0.00 36763.43 5356.86 34192.70 00:35:51.783 0 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@205 -- # sequence=69758 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@22 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:51.783 14:01:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:52.351 14:01:40 chaining -- bdev/chaining.sh@206 -- # encrypt=34879 00:35:52.351 14:01:40 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:35:52.351 14:01:40 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:52.351 14:01:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:52.351 14:01:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:52.351 14:01:40 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:52.351 14:01:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:52.351 14:01:40 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:52.351 14:01:40 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:52.351 14:01:40 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:52.351 14:01:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:52.919 14:01:41 chaining -- bdev/chaining.sh@207 -- # decrypt=34879 00:35:52.919 14:01:41 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:35:52.919 14:01:41 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:52.919 14:01:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:52.919 14:01:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:52.919 14:01:41 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:52.919 14:01:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:52.919 14:01:41 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:52.919 14:01:41 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:52.919 14:01:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:52.919 14:01:41 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:53.179 14:01:41 chaining -- bdev/chaining.sh@208 -- # crc32c=69758 00:35:53.179 14:01:41 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:35:53.179 14:01:41 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:35:53.179 14:01:41 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:35:53.179 14:01:41 chaining -- bdev/chaining.sh@214 -- # killprocess 649260 00:35:53.179 14:01:41 chaining -- common/autotest_common.sh@948 -- # '[' -z 649260 ']' 00:35:53.179 14:01:41 chaining -- common/autotest_common.sh@952 -- # kill -0 649260 00:35:53.179 14:01:41 chaining -- common/autotest_common.sh@953 -- # uname 00:35:53.179 14:01:41 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:53.179 14:01:41 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 649260 00:35:53.179 14:01:41 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:53.179 14:01:41 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:53.179 14:01:41 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 649260' 00:35:53.179 killing process with pid 649260 00:35:53.179 14:01:41 chaining -- common/autotest_common.sh@967 -- # kill 649260 00:35:53.179 Received shutdown signal, test 
time was about 5.000000 seconds 00:35:53.179 00:35:53.179 Latency(us) 00:35:53.179 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:53.179 =================================================================================================================== 00:35:53.179 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:53.179 14:01:41 chaining -- common/autotest_common.sh@972 -- # wait 649260 00:35:53.438 14:01:41 chaining -- bdev/chaining.sh@219 -- # bperfpid=650327 00:35:53.438 14:01:41 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:35:53.438 14:01:41 chaining -- bdev/chaining.sh@221 -- # waitforlisten 650327 /var/tmp/bperf.sock 00:35:53.438 14:01:41 chaining -- common/autotest_common.sh@829 -- # '[' -z 650327 ']' 00:35:53.438 14:01:41 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:53.438 14:01:41 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:53.438 14:01:41 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:53.438 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:53.438 14:01:41 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:53.438 14:01:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:53.438 [2024-07-12 14:01:41.948053] Starting SPDK v24.09-pre git sha1 a49cd26ae / DPDK 24.03.0 initialization... 00:35:53.438 [2024-07-12 14:01:41.948124] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid650327 ] 00:35:53.697 [2024-07-12 14:01:42.076528] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:53.697 [2024-07-12 14:01:42.177871] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:54.635 14:01:42 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:54.635 14:01:42 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:54.635 14:01:42 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:35:54.635 14:01:42 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:54.893 [2024-07-12 14:01:43.281712] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:54.893 nvme0n1 00:35:54.893 true 00:35:54.893 crypto0 00:35:54.893 14:01:43 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:54.893 Running I/O for 5 seconds... 
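The second run above follows the same driver pattern as the first: bdevperf is launched against its own RPC socket with --wait-for-rpc -z, configured over that socket, and then kicked off with bdevperf.py perform_tests (its results follow below). A condensed sketch of that flow, with the binary paths and flags taken from the log; the crypto-bdev configuration step is left out because the log truncates it:

    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    BPERF_SOCK=/var/tmp/bperf.sock
    # -z keeps bdevperf idle until perform_tests is sent over the RPC socket
    "$SPDK"/build/examples/bdevperf -r "$BPERF_SOCK" -t 5 -w verify -o 65536 -q 32 \
        --wait-for-rpc -z &
    bperfpid=$!
    # ... wait for the socket, then create the crypto0 bdev over it (elided, as in the log) ...
    "$SPDK"/examples/bdev/bdevperf/bdevperf.py -s "$BPERF_SOCK" perform_tests
    kill "$bperfpid"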
00:36:00.170 00:36:00.170 Latency(us) 00:36:00.170 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:00.170 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:36:00.170 Verification LBA range: start 0x0 length 0x200 00:36:00.170 crypto0 : 5.01 1658.42 103.65 0.00 0.00 18915.43 1488.81 20629.59 00:36:00.170 =================================================================================================================== 00:36:00.170 Total : 1658.42 103.65 0.00 0.00 18915.43 1488.81 20629.59 00:36:00.170 0 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@233 -- # sequence=16610 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:00.170 14:01:48 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:00.429 14:01:48 chaining -- bdev/chaining.sh@234 -- # encrypt=8305 00:36:00.429 14:01:48 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:36:00.429 14:01:48 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:36:00.429 14:01:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:00.429 14:01:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:00.429 14:01:48 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:00.429 14:01:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:00.429 14:01:48 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:00.429 14:01:48 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:00.429 14:01:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:00.429 14:01:48 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:00.687 14:01:49 chaining -- bdev/chaining.sh@235 -- # decrypt=8305 
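The sequence/encrypt/decrypt counters gathered above (and the crc32c counter queried next) come straight from the accel framework statistics of the bdevperf process; the test then checks that encrypt + decrypt matches both the sequence and crc32c totals. The same queries can be issued standalone, exactly as the log's get_stat helper shows:

    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    SOCK=/var/tmp/bperf.sock
    "$RPC" -s "$SOCK" accel_get_stats | jq -r .sequence_executed
    "$RPC" -s "$SOCK" accel_get_stats | jq -r '.operations[] | select(.opcode == "encrypt").executed'
    "$RPC" -s "$SOCK" accel_get_stats | jq -r '.operations[] | select(.opcode == "decrypt").executed'
    "$RPC" -s "$SOCK" accel_get_stats | jq -r '.operations[] | select(.opcode == "crc32c").executed'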
00:36:00.687 14:01:49 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:36:00.687 14:01:49 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:36:00.687 14:01:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:00.687 14:01:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:00.687 14:01:49 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:36:00.687 14:01:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:00.687 14:01:49 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:36:00.687 14:01:49 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:00.687 14:01:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:36:00.687 14:01:49 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:00.946 14:01:49 chaining -- bdev/chaining.sh@236 -- # crc32c=16610 00:36:00.946 14:01:49 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:36:00.946 14:01:49 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:36:00.946 14:01:49 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:36:00.946 14:01:49 chaining -- bdev/chaining.sh@242 -- # killprocess 650327 00:36:00.946 14:01:49 chaining -- common/autotest_common.sh@948 -- # '[' -z 650327 ']' 00:36:00.946 14:01:49 chaining -- common/autotest_common.sh@952 -- # kill -0 650327 00:36:00.946 14:01:49 chaining -- common/autotest_common.sh@953 -- # uname 00:36:00.946 14:01:49 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:00.946 14:01:49 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 650327 00:36:00.946 14:01:49 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:00.946 14:01:49 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:00.946 14:01:49 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 650327' 00:36:00.946 killing process with pid 650327 00:36:00.946 14:01:49 chaining -- common/autotest_common.sh@967 -- # kill 650327 00:36:00.946 Received shutdown signal, test time was about 5.000000 seconds 00:36:00.946 00:36:00.946 Latency(us) 00:36:00.946 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:00.946 =================================================================================================================== 00:36:00.946 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:00.946 14:01:49 chaining -- common/autotest_common.sh@972 -- # wait 650327 00:36:01.205 14:01:49 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:36:01.205 14:01:49 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:36:01.205 14:01:49 chaining -- nvmf/common.sh@117 -- # sync 00:36:01.205 14:01:49 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:36:01.205 14:01:49 chaining -- nvmf/common.sh@120 -- # set +e 00:36:01.205 14:01:49 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:36:01.205 14:01:49 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:36:01.205 rmmod nvme_tcp 00:36:01.205 rmmod nvme_fabrics 00:36:01.464 rmmod nvme_keyring 00:36:01.464 14:01:49 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:36:01.464 14:01:49 chaining -- nvmf/common.sh@124 -- # set -e 00:36:01.464 14:01:49 chaining -- nvmf/common.sh@125 -- # return 0 00:36:01.464 14:01:49 chaining -- nvmf/common.sh@489 -- # '[' -n 
649095 ']' 00:36:01.464 14:01:49 chaining -- nvmf/common.sh@490 -- # killprocess 649095 00:36:01.464 14:01:49 chaining -- common/autotest_common.sh@948 -- # '[' -z 649095 ']' 00:36:01.464 14:01:49 chaining -- common/autotest_common.sh@952 -- # kill -0 649095 00:36:01.464 14:01:49 chaining -- common/autotest_common.sh@953 -- # uname 00:36:01.464 14:01:49 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:01.464 14:01:49 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 649095 00:36:01.464 14:01:49 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:36:01.464 14:01:49 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:36:01.464 14:01:49 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 649095' 00:36:01.464 killing process with pid 649095 00:36:01.464 14:01:49 chaining -- common/autotest_common.sh@967 -- # kill 649095 00:36:01.464 14:01:49 chaining -- common/autotest_common.sh@972 -- # wait 649095 00:36:01.723 14:01:50 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:36:01.723 14:01:50 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:36:01.723 14:01:50 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:36:01.723 14:01:50 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:36:01.723 14:01:50 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:36:01.723 14:01:50 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:01.723 14:01:50 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:01.723 14:01:50 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:01.723 14:01:50 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:36:01.723 14:01:50 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:36:01.723 00:36:01.723 real 0m46.842s 00:36:01.723 user 1m1.140s 00:36:01.723 sys 0m13.905s 00:36:01.723 14:01:50 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:01.723 14:01:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:01.723 ************************************ 00:36:01.723 END TEST chaining 00:36:01.723 ************************************ 00:36:01.723 14:01:50 -- common/autotest_common.sh@1142 -- # return 0 00:36:01.723 14:01:50 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:36:01.723 14:01:50 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:36:01.723 14:01:50 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:36:01.723 14:01:50 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:36:01.723 14:01:50 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:36:01.723 14:01:50 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:36:01.723 14:01:50 -- common/autotest_common.sh@722 -- # xtrace_disable 00:36:01.723 14:01:50 -- common/autotest_common.sh@10 -- # set +x 00:36:01.723 14:01:50 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:36:01.723 14:01:50 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:36:01.723 14:01:50 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:36:01.723 14:01:50 -- common/autotest_common.sh@10 -- # set +x 00:36:06.996 INFO: APP EXITING 00:36:06.996 INFO: killing all VMs 00:36:06.996 INFO: killing vhost app 00:36:06.996 INFO: EXIT DONE 00:36:10.282 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:10.282 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:10.282 Waiting for block devices as 
requested 00:36:10.282 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:36:10.282 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:10.282 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:10.282 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:10.541 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:10.541 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:10.541 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:10.800 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:10.800 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:10.800 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:11.059 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:11.059 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:11.059 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:11.318 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:11.318 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:11.318 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:11.577 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:15.791 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:15.791 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:15.791 Cleaning 00:36:15.791 Removing: /var/run/dpdk/spdk0/config 00:36:15.791 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:36:15.791 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:36:15.791 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:36:15.791 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:36:15.791 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:36:15.791 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:36:15.791 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:36:15.791 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:36:15.791 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:36:15.791 Removing: /var/run/dpdk/spdk0/hugepage_info 00:36:15.791 Removing: /dev/shm/nvmf_trace.0 00:36:15.791 Removing: /dev/shm/spdk_tgt_trace.pid385600 00:36:15.791 Removing: /var/run/dpdk/spdk0 00:36:15.791 Removing: /var/run/dpdk/spdk_pid384752 00:36:15.791 Removing: /var/run/dpdk/spdk_pid385600 00:36:15.791 Removing: /var/run/dpdk/spdk_pid386195 00:36:15.791 Removing: /var/run/dpdk/spdk_pid386981 00:36:15.791 Removing: /var/run/dpdk/spdk_pid387205 00:36:15.791 Removing: /var/run/dpdk/spdk_pid387960 00:36:15.791 Removing: /var/run/dpdk/spdk_pid388139 00:36:15.791 Removing: /var/run/dpdk/spdk_pid388431 00:36:15.791 Removing: /var/run/dpdk/spdk_pid391041 00:36:15.791 Removing: /var/run/dpdk/spdk_pid392395 00:36:15.791 Removing: /var/run/dpdk/spdk_pid392618 00:36:15.791 Removing: /var/run/dpdk/spdk_pid392934 00:36:15.791 Removing: /var/run/dpdk/spdk_pid393250 00:36:15.791 Removing: /var/run/dpdk/spdk_pid393501 00:36:15.791 Removing: /var/run/dpdk/spdk_pid393705 00:36:15.791 Removing: /var/run/dpdk/spdk_pid393899 00:36:15.791 Removing: /var/run/dpdk/spdk_pid394123 00:36:15.791 Removing: /var/run/dpdk/spdk_pid394874 00:36:15.791 Removing: /var/run/dpdk/spdk_pid397963 00:36:15.791 Removing: /var/run/dpdk/spdk_pid398272 00:36:15.791 Removing: /var/run/dpdk/spdk_pid398506 00:36:15.791 Removing: /var/run/dpdk/spdk_pid398842 00:36:15.791 Removing: /var/run/dpdk/spdk_pid398909 00:36:15.791 Removing: /var/run/dpdk/spdk_pid399120 00:36:15.791 Removing: /var/run/dpdk/spdk_pid399337 00:36:15.791 Removing: /var/run/dpdk/spdk_pid399528 00:36:15.791 Removing: /var/run/dpdk/spdk_pid399734 00:36:15.791 Removing: /var/run/dpdk/spdk_pid399941 00:36:15.791 Removing: 
/var/run/dpdk/spdk_pid400247 00:36:15.791 Removing: /var/run/dpdk/spdk_pid400482 00:36:15.791 Removing: /var/run/dpdk/spdk_pid400682 00:36:15.791 Removing: /var/run/dpdk/spdk_pid400879 00:36:15.791 Removing: /var/run/dpdk/spdk_pid401081 00:36:15.791 Removing: /var/run/dpdk/spdk_pid401337 00:36:15.791 Removing: /var/run/dpdk/spdk_pid401632 00:36:15.791 Removing: /var/run/dpdk/spdk_pid401823 00:36:15.791 Removing: /var/run/dpdk/spdk_pid402025 00:36:15.791 Removing: /var/run/dpdk/spdk_pid402218 00:36:15.791 Removing: /var/run/dpdk/spdk_pid402422 00:36:15.791 Removing: /var/run/dpdk/spdk_pid402734 00:36:15.791 Removing: /var/run/dpdk/spdk_pid402972 00:36:15.791 Removing: /var/run/dpdk/spdk_pid403175 00:36:15.791 Removing: /var/run/dpdk/spdk_pid403369 00:36:15.791 Removing: /var/run/dpdk/spdk_pid403569 00:36:15.791 Removing: /var/run/dpdk/spdk_pid403926 00:36:15.791 Removing: /var/run/dpdk/spdk_pid404216 00:36:15.791 Removing: /var/run/dpdk/spdk_pid404497 00:36:15.791 Removing: /var/run/dpdk/spdk_pid404867 00:36:15.791 Removing: /var/run/dpdk/spdk_pid405236 00:36:15.791 Removing: /var/run/dpdk/spdk_pid405442 00:36:15.791 Removing: /var/run/dpdk/spdk_pid405805 00:36:15.791 Removing: /var/run/dpdk/spdk_pid406190 00:36:15.791 Removing: /var/run/dpdk/spdk_pid406323 00:36:15.791 Removing: /var/run/dpdk/spdk_pid406680 00:36:15.791 Removing: /var/run/dpdk/spdk_pid407148 00:36:15.791 Removing: /var/run/dpdk/spdk_pid407445 00:36:15.791 Removing: /var/run/dpdk/spdk_pid407553 00:36:15.791 Removing: /var/run/dpdk/spdk_pid411886 00:36:15.791 Removing: /var/run/dpdk/spdk_pid413593 00:36:15.791 Removing: /var/run/dpdk/spdk_pid415211 00:36:15.791 Removing: /var/run/dpdk/spdk_pid416019 00:36:15.791 Removing: /var/run/dpdk/spdk_pid417255 00:36:16.066 Removing: /var/run/dpdk/spdk_pid417508 00:36:16.066 Removing: /var/run/dpdk/spdk_pid417641 00:36:16.066 Removing: /var/run/dpdk/spdk_pid417672 00:36:16.066 Removing: /var/run/dpdk/spdk_pid421490 00:36:16.066 Removing: /var/run/dpdk/spdk_pid422016 00:36:16.066 Removing: /var/run/dpdk/spdk_pid423134 00:36:16.066 Removing: /var/run/dpdk/spdk_pid423397 00:36:16.066 Removing: /var/run/dpdk/spdk_pid429237 00:36:16.066 Removing: /var/run/dpdk/spdk_pid430994 00:36:16.066 Removing: /var/run/dpdk/spdk_pid431865 00:36:16.066 Removing: /var/run/dpdk/spdk_pid435932 00:36:16.066 Removing: /var/run/dpdk/spdk_pid437900 00:36:16.066 Removing: /var/run/dpdk/spdk_pid438707 00:36:16.066 Removing: /var/run/dpdk/spdk_pid442961 00:36:16.066 Removing: /var/run/dpdk/spdk_pid445850 00:36:16.066 Removing: /var/run/dpdk/spdk_pid446722 00:36:16.066 Removing: /var/run/dpdk/spdk_pid457290 00:36:16.066 Removing: /var/run/dpdk/spdk_pid459705 00:36:16.066 Removing: /var/run/dpdk/spdk_pid460685 00:36:16.066 Removing: /var/run/dpdk/spdk_pid470265 00:36:16.066 Removing: /var/run/dpdk/spdk_pid472484 00:36:16.066 Removing: /var/run/dpdk/spdk_pid473607 00:36:16.066 Removing: /var/run/dpdk/spdk_pid483693 00:36:16.066 Removing: /var/run/dpdk/spdk_pid487292 00:36:16.066 Removing: /var/run/dpdk/spdk_pid488614 00:36:16.066 Removing: /var/run/dpdk/spdk_pid500042 00:36:16.066 Removing: /var/run/dpdk/spdk_pid502680 00:36:16.066 Removing: /var/run/dpdk/spdk_pid503843 00:36:16.066 Removing: /var/run/dpdk/spdk_pid515867 00:36:16.066 Removing: /var/run/dpdk/spdk_pid518408 00:36:16.066 Removing: /var/run/dpdk/spdk_pid519912 00:36:16.066 Removing: /var/run/dpdk/spdk_pid531178 00:36:16.066 Removing: /var/run/dpdk/spdk_pid535872 00:36:16.066 Removing: /var/run/dpdk/spdk_pid536853 00:36:16.066 Removing: 
/var/run/dpdk/spdk_pid538511 00:36:16.066 Removing: /var/run/dpdk/spdk_pid541740 00:36:16.066 Removing: /var/run/dpdk/spdk_pid546859 00:36:16.066 Removing: /var/run/dpdk/spdk_pid549390 00:36:16.066 Removing: /var/run/dpdk/spdk_pid553759 00:36:16.066 Removing: /var/run/dpdk/spdk_pid557149 00:36:16.066 Removing: /var/run/dpdk/spdk_pid563191 00:36:16.066 Removing: /var/run/dpdk/spdk_pid566610 00:36:16.066 Removing: /var/run/dpdk/spdk_pid573248 00:36:16.066 Removing: /var/run/dpdk/spdk_pid575493 00:36:16.066 Removing: /var/run/dpdk/spdk_pid581757 00:36:16.066 Removing: /var/run/dpdk/spdk_pid584391 00:36:16.066 Removing: /var/run/dpdk/spdk_pid590646 00:36:16.066 Removing: /var/run/dpdk/spdk_pid593074 00:36:16.066 Removing: /var/run/dpdk/spdk_pid597562 00:36:16.066 Removing: /var/run/dpdk/spdk_pid597921 00:36:16.066 Removing: /var/run/dpdk/spdk_pid598273 00:36:16.066 Removing: /var/run/dpdk/spdk_pid598636 00:36:16.066 Removing: /var/run/dpdk/spdk_pid599074 00:36:16.066 Removing: /var/run/dpdk/spdk_pid599847 00:36:16.066 Removing: /var/run/dpdk/spdk_pid600683 00:36:16.066 Removing: /var/run/dpdk/spdk_pid601124 00:36:16.067 Removing: /var/run/dpdk/spdk_pid602740 00:36:16.067 Removing: /var/run/dpdk/spdk_pid604446 00:36:16.067 Removing: /var/run/dpdk/spdk_pid606111 00:36:16.067 Removing: /var/run/dpdk/spdk_pid607418 00:36:16.067 Removing: /var/run/dpdk/spdk_pid609486 00:36:16.067 Removing: /var/run/dpdk/spdk_pid611407 00:36:16.067 Removing: /var/run/dpdk/spdk_pid613246 00:36:16.067 Removing: /var/run/dpdk/spdk_pid614543 00:36:16.067 Removing: /var/run/dpdk/spdk_pid615093 00:36:16.067 Removing: /var/run/dpdk/spdk_pid615548 00:36:16.325 Removing: /var/run/dpdk/spdk_pid617629 00:36:16.325 Removing: /var/run/dpdk/spdk_pid619486 00:36:16.325 Removing: /var/run/dpdk/spdk_pid621317 00:36:16.325 Removing: /var/run/dpdk/spdk_pid622379 00:36:16.325 Removing: /var/run/dpdk/spdk_pid623606 00:36:16.325 Removing: /var/run/dpdk/spdk_pid624147 00:36:16.325 Removing: /var/run/dpdk/spdk_pid624238 00:36:16.325 Removing: /var/run/dpdk/spdk_pid624399 00:36:16.325 Removing: /var/run/dpdk/spdk_pid624614 00:36:16.325 Removing: /var/run/dpdk/spdk_pid624793 00:36:16.325 Removing: /var/run/dpdk/spdk_pid626044 00:36:16.325 Removing: /var/run/dpdk/spdk_pid627549 00:36:16.325 Removing: /var/run/dpdk/spdk_pid629048 00:36:16.325 Removing: /var/run/dpdk/spdk_pid629766 00:36:16.326 Removing: /var/run/dpdk/spdk_pid630643 00:36:16.326 Removing: /var/run/dpdk/spdk_pid630847 00:36:16.326 Removing: /var/run/dpdk/spdk_pid630915 00:36:16.326 Removing: /var/run/dpdk/spdk_pid631056 00:36:16.326 Removing: /var/run/dpdk/spdk_pid631987 00:36:16.326 Removing: /var/run/dpdk/spdk_pid632452 00:36:16.326 Removing: /var/run/dpdk/spdk_pid632912 00:36:16.326 Removing: /var/run/dpdk/spdk_pid635735 00:36:16.326 Removing: /var/run/dpdk/spdk_pid637536 00:36:16.326 Removing: /var/run/dpdk/spdk_pid639290 00:36:16.326 Removing: /var/run/dpdk/spdk_pid640354 00:36:16.326 Removing: /var/run/dpdk/spdk_pid641585 00:36:16.326 Removing: /var/run/dpdk/spdk_pid642128 00:36:16.326 Removing: /var/run/dpdk/spdk_pid642267 00:36:16.326 Removing: /var/run/dpdk/spdk_pid646220 00:36:16.326 Removing: /var/run/dpdk/spdk_pid646430 00:36:16.326 Removing: /var/run/dpdk/spdk_pid646624 00:36:16.326 Removing: /var/run/dpdk/spdk_pid646665 00:36:16.326 Removing: /var/run/dpdk/spdk_pid646870 00:36:16.326 Removing: /var/run/dpdk/spdk_pid647080 00:36:16.326 Removing: /var/run/dpdk/spdk_pid647956 00:36:16.326 Removing: /var/run/dpdk/spdk_pid649260 00:36:16.326 Removing: 
/var/run/dpdk/spdk_pid650327 00:36:16.326 Clean 00:36:16.584 14:02:04 -- common/autotest_common.sh@1451 -- # return 0 00:36:16.584 14:02:04 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:36:16.584 14:02:04 -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:16.584 14:02:04 -- common/autotest_common.sh@10 -- # set +x 00:36:16.584 14:02:04 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:36:16.584 14:02:04 -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:16.584 14:02:04 -- common/autotest_common.sh@10 -- # set +x 00:36:16.584 14:02:05 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:36:16.585 14:02:05 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:36:16.585 14:02:05 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:36:16.585 14:02:05 -- spdk/autotest.sh@391 -- # hash lcov 00:36:16.585 14:02:05 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:36:16.585 14:02:05 -- spdk/autotest.sh@393 -- # hostname 00:36:16.585 14:02:05 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:36:16.843 geninfo: WARNING: invalid characters removed from testname! 00:36:48.925 14:02:32 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:48.925 14:02:35 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:49.863 14:02:38 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:53.154 14:02:41 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:55.690 14:02:43 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:58.226 14:02:46 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:00.762 14:02:48 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:37:00.762 14:02:48 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:37:00.762 14:02:48 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:37:00.762 14:02:48 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:37:00.762 14:02:48 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:37:00.762 14:02:48 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:00.762 14:02:48 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:00.762 14:02:48 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:00.762 14:02:48 -- paths/export.sh@5 -- $ export PATH 00:37:00.762 14:02:48 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:00.762 14:02:48 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:00.762 14:02:48 -- common/autobuild_common.sh@444 -- $ date +%s 00:37:00.762 14:02:48 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720785768.XXXXXX 00:37:00.762 14:02:48 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720785768.p7nf3q 00:37:00.762 14:02:48 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:37:00.762 14:02:48 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 00:37:00.762 14:02:48 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:37:00.762 14:02:48 
-- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:37:00.762 14:02:48 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:37:00.762 14:02:48 -- common/autobuild_common.sh@460 -- $ get_config_params 00:37:00.762 14:02:48 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:37:00.762 14:02:48 -- common/autotest_common.sh@10 -- $ set +x 00:37:00.762 14:02:49 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:37:00.762 14:02:49 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:37:00.762 14:02:49 -- pm/common@17 -- $ local monitor 00:37:00.762 14:02:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:00.762 14:02:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:00.762 14:02:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:00.762 14:02:49 -- pm/common@21 -- $ date +%s 00:37:00.762 14:02:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:00.762 14:02:49 -- pm/common@21 -- $ date +%s 00:37:00.762 14:02:49 -- pm/common@25 -- $ sleep 1 00:37:00.762 14:02:49 -- pm/common@21 -- $ date +%s 00:37:00.762 14:02:49 -- pm/common@21 -- $ date +%s 00:37:00.762 14:02:49 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720785769 00:37:00.762 14:02:49 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720785769 00:37:00.762 14:02:49 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720785769 00:37:00.762 14:02:49 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720785769 00:37:00.762 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720785769_collect-vmstat.pm.log 00:37:00.762 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720785769_collect-cpu-load.pm.log 00:37:00.762 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720785769_collect-cpu-temp.pm.log 00:37:00.762 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720785769_collect-bmc-pm.bmc.pm.log 00:37:01.702 14:02:50 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:37:01.702 14:02:50 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72 00:37:01.702 14:02:50 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:01.702 14:02:50 -- 
spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:37:01.702 14:02:50 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:37:01.702 14:02:50 -- spdk/autopackage.sh@19 -- $ timing_finish 00:37:01.702 14:02:50 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:37:01.702 14:02:50 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:37:01.702 14:02:50 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:37:01.702 14:02:50 -- spdk/autopackage.sh@20 -- $ exit 0 00:37:01.702 14:02:50 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:37:01.702 14:02:50 -- pm/common@29 -- $ signal_monitor_resources TERM 00:37:01.702 14:02:50 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:37:01.702 14:02:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:01.702 14:02:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:37:01.702 14:02:50 -- pm/common@44 -- $ pid=661064 00:37:01.702 14:02:50 -- pm/common@50 -- $ kill -TERM 661064 00:37:01.702 14:02:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:01.702 14:02:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:37:01.702 14:02:50 -- pm/common@44 -- $ pid=661066 00:37:01.702 14:02:50 -- pm/common@50 -- $ kill -TERM 661066 00:37:01.702 14:02:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:01.702 14:02:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:37:01.702 14:02:50 -- pm/common@44 -- $ pid=661068 00:37:01.702 14:02:50 -- pm/common@50 -- $ kill -TERM 661068 00:37:01.702 14:02:50 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:01.702 14:02:50 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:37:01.702 14:02:50 -- pm/common@44 -- $ pid=661092 00:37:01.702 14:02:50 -- pm/common@50 -- $ sudo -E kill -TERM 661092 00:37:01.702 + [[ -n 273415 ]] 00:37:01.702 + sudo kill 273415 00:37:01.712 [Pipeline] } 00:37:01.730 [Pipeline] // stage 00:37:01.735 [Pipeline] } 00:37:01.752 [Pipeline] // timeout 00:37:01.757 [Pipeline] } 00:37:01.774 [Pipeline] // catchError 00:37:01.780 [Pipeline] } 00:37:01.797 [Pipeline] // wrap 00:37:01.804 [Pipeline] } 00:37:01.819 [Pipeline] // catchError 00:37:01.829 [Pipeline] stage 00:37:01.832 [Pipeline] { (Epilogue) 00:37:01.846 [Pipeline] catchError 00:37:01.848 [Pipeline] { 00:37:01.863 [Pipeline] echo 00:37:01.865 Cleanup processes 00:37:01.871 [Pipeline] sh 00:37:02.154 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:02.155 661171 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:37:02.155 661387 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:02.169 [Pipeline] sh 00:37:02.454 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:02.454 ++ grep -v 'sudo pgrep' 00:37:02.454 ++ awk '{print $1}' 00:37:02.454 + sudo kill -9 661171 00:37:02.465 [Pipeline] sh 00:37:02.748 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:37:15.000 [Pipeline] sh 00:37:15.285 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:37:15.286 Artifacts 
sizes are good 00:37:15.301 [Pipeline] archiveArtifacts 00:37:15.308 Archiving artifacts 00:37:15.490 [Pipeline] sh 00:37:15.774 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:37:15.787 [Pipeline] cleanWs 00:37:15.796 [WS-CLEANUP] Deleting project workspace... 00:37:15.796 [WS-CLEANUP] Deferred wipeout is used... 00:37:15.802 [WS-CLEANUP] done 00:37:15.804 [Pipeline] } 00:37:15.817 [Pipeline] // catchError 00:37:15.828 [Pipeline] sh 00:37:16.110 + logger -p user.info -t JENKINS-CI 00:37:16.118 [Pipeline] } 00:37:16.134 [Pipeline] // stage 00:37:16.139 [Pipeline] } 00:37:16.156 [Pipeline] // node 00:37:16.160 [Pipeline] End of Pipeline 00:37:16.184 Finished: SUCCESS
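The stray-process sweep in the epilogue above is a pgrep/awk/kill pipeline scoped to the workspace; a standalone sketch using the path from this log (the helper names and retry behaviour of the real cleanup script are not reproduced here):

    WORKSPACE=/var/jenkins/workspace/crypto-phy-autotest
    # list anything still running out of the workspace, drop the pgrep line itself, keep the PIDs
    pids=$(sudo pgrep -af "$WORKSPACE/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
    # kill leftovers so the workspace can be wiped cleanly; the empty case is not an error
    [ -n "$pids" ] && sudo kill -9 $pids || true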